Interuniversal travel and the fate of civilizations
HarbingerDawnDate: Monday, 23.07.2012, 23:00 | Message # 46
Cosmic Curator
Group: Administrators
United States
Messages: 8717
Status: Offline
Quote (DoctorOfSpace)
Like the Amish

I was thinking exactly that too when reading and replying to your previous post! happy

Quote (DoctorOfSpace)
Personally the moment the technology is available I'd love to start replacing cells with nanobots. Just imagine becoming a cloud of sentient nanobots capable of taking on any form and not dying in the vacuum of space.

I'd gladly shed a permanent body altogether and exist as a digital mind. I could control any number of avatars as needed, whether just audio/visual representations of myself, actual robotic bodies, or other equipment. That would be very freeing, and enabling too.





All forum users, please read this!
My SE mods and addons
Phenom II X6 1090T 3.2 GHz, 16 GB DDR3 RAM, GTX 970 3584 MB VRAM


Edited by HarbingerDawn - Monday, 23.07.2012, 23:00
 
DoctorOfSpaceDate: Tuesday, 24.07.2012, 01:58 | Message # 47
Galaxy Architect
Group: Global Moderators
Pirate
Messages: 3600
Status: Offline
Quote (HarbingerDawn)

I'd gladly shed a permanent body altogether and exist as a digital mind. I could control any number of avatars as needed, whether just audio/visual representations of myself, actual robotic bodies, or other equipment. That would be very freeing, and enabling too.


But what happens when someone trips over the power cord to the PC/server your consciousness is on? tongue

Also, you won't survive a digital transfer. Since your mind is built upon a physical brain, the most you could do is make a copy, which in the end isn't the original you.





Intel Core i7-5820K 4.2GHz 6-Core Processor
G.Skill Ripjaws V Series 32GB (4 x 8GB) DDR4-2400 Memory
EVGA GTX 980 Ti SC 6GB
 
HarbingerDawnDate: Tuesday, 24.07.2012, 03:42 | Message # 48
Cosmic Curator
Group: Administrators
United States
Messages: 8717
Status: Offline
Quote (DoctorOfSpace)
But what happens when someone trips over the power cord to the PC/server your consciousness is on?

I'm pretty sure that the systems in place to handle these things would be a bit more secure than that...

Quote (DoctorOfSpace)
Also, you won't survive a digital transfer. Since your mind is built upon a physical brain, the most you could do is make a copy, which in the end isn't the original you.

If one day in the future we discovered how the brain worked in every detail, were able to read the instantaneous states and patterns of activity in every neural connection, and were able to transcribe that data into a digital form with the necessary framework to support its existence and operation, then you would essentially have the ability to make an exact functional copy of a human mind. Once the mind was copied into digital form and confirmed to be working properly, the host body (you) could be killed and disposed of, unless other arrangements were made beforehand.
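To make the "scan, transcribe, verify" idea a little more concrete, here's a toy sketch in Python. Every name and number in it is invented purely for illustration; a real brain is obviously not a dictionary of numbers, and no such scanner exists.

```python
import copy

def scan_brain(brain_state):
    """Pretend to read the instantaneous state of every neural connection."""
    return copy.deepcopy(brain_state)

def verify_copy(original, digital):
    """Accept the digital mind only if it matches the original exactly."""
    return original == digital

# Stand-in data: a handful of made-up "connection strengths".
organic_mind = {"synapse_0": 0.42, "synapse_1": 0.13, "synapse_2": 0.87}

digital_mind = scan_brain(organic_mind)

assert verify_copy(organic_mind, digital_mind)   # same content...
assert digital_mind is not organic_mind          # ...but a separate, independent object
```

Only after that verification step would the question of what to do with the original even come up.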

It's kind of like a Star Trek transporter: you basically make a copy of someone in digital form, destroy the corporeal form, then reassemble the corporeal form in a different location from the digital copy. In some cases, two separate corporeal forms come out of the process (as was the case with Commander Riker). No one would argue that every time someone uses a transporter they cease to be "them". If you are duplicated perfectly and reconstructed elsewhere, then you are still you. The only confusion comes in when the original also still exists. If that is the case, then you become two different people. But up until the moment of the split, you were one person (had one identity).
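A loose programming analogy for the Riker case (the data here is obviously made up): two perfect copies compare as equal at the instant of duplication, but they are distinct objects, and they diverge as soon as their experiences differ.

```python
import copy

person = {"name": "Riker", "memories": ["Starfleet Academy", "Nervala IV mission"]}

will = person                     # the "original"
thomas = copy.deepcopy(person)    # the transporter duplicate

print(will == thomas)             # True:  identical histories up to the split
print(will is thomas)             # False: two distinct individuals from now on

will["memories"].append("stayed with the Enterprise")
thomas["memories"].append("stranded on Nervala IV for eight years")

print(will == thomas)             # False: their identities have diverged
```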

The point is, if you make a digital copy of a human mind, it retains its identity, provided that the digital framework for that mind is not so different from the workings of an organic brain (which would be necessary if you wanted to become digital, since retaining your identity is the goal).

If people could not retain their identity, then few people would be signing up to have their minds copied and uploaded.





All forum users, please read this!
My SE mods and addons
Phenom II X6 1090T 3.2 GHz, 16 GB DDR3 RAM, GTX 970 3584 MB VRAM
 
TalismanDate: Tuesday, 24.07.2012, 04:54 | Message # 49
Pioneer
Group: Users
United States
Messages: 409
Status: Offline
Quote (HarbingerDawn)
No one would argue that every time someone uses a transporter they cease to be "them". If you are duplicated perfectly and reconstructed elsewhere, then you are still you.


This is the scary part... How can we know that your consciousness is actually the same one and was switched over? Of course your clone will argue NO REALLY IT'S ME I PROMISE! He will feel that way, but what's to say the original wasn't really you, who died, and the copy just someone else with your memories and exact same thoughts? I can tell you I will not be the first one up to try teleportation when they make something like that, where it's just a big advanced copy machine and the original is "destroyed". cool

Also, Harb, how far away do you think we are from the singularity? And what will people do for fun when we're a "singularity"?





 
apenpaapDate: Tuesday, 24.07.2012, 11:09 | Message # 50
World Builder
Group: Users
Antarctica
Messages: 1063
Status: Offline
Personally I think the singularity is further away than people like Ray Kurzweil think. There are two problems with the idea of the singularity happening soon: its reliance on Moore's law, and on the ability to read a human mind and exactly duplicate it in a machine. The problem with Moore's law is quite simply that chips don't have much more miniaturisation to go before they start running into the physical barrier of the size of atoms. At that point, making them even smaller will be impossible, as parts of them would have to consist of less than a single atom. When that happens, Moore's law will end, and computing power will stagnate or increase much more slowly, at least until we manage to make quantum computers.
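A rough back-of-the-envelope check of the atomic-size argument (the ~22 nm starting node and the two-year halving period are my own assumptions, not figures anyone quoted):

```python
import math

feature_nm = 22.0          # assumed ~2012 process node
atom_nm = 0.2              # silicon atomic spacing is on the order of 0.2 nm
years_per_halving = 2.0    # assumed Moore's-law cadence

halvings_left = math.log2(feature_nm / atom_nm)
years_left = halvings_left * years_per_halving
print(f"~{halvings_left:.1f} halvings left, roughly {years_left:.0f} years of classical shrinking")
# -> about 6.8 halvings, i.e. on the order of 14 years
```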

As for reading the human mind or replicating it, as a student of biology I can tell you that's still far off. The human brain is insanely complex; even now we have only a rough guess of what happens where and how some things happen, and we just plain don't know others. Using EEGs, it is possible to get a very rough idea of what the person might be feeling or what they are concentrating on, but that's all at the moment. And don't get me started on the silly idea some people have that they will be able to make an exact simulation of the human brain by cutting up dead brains and studying the exact layout of the cells very closely.

Finally, if somehow they manage to overcome all of that at some point, there's another problem: accessing the information in the brain. EEGs can show how electric potentials run in the brain and give some superficial information, but to access all the information in the brain, taking it out will be the only option.





I occasionally stream at http://www.twitch.tv/magistermystax. Sometimes SE, sometimes other games.
 
HarbingerDawnDate: Tuesday, 24.07.2012, 12:38 | Message # 51
Cosmic Curator
Group: Administrators
United States
Messages: 8717
Status: Offline
apenpaap, I agree wholeheartedly. I'm not worried about this singularity business happening anytime soon either. And I am acutely aware of the deficiencies in our knowledge of neurology. That in particular has a long way to go before any of what we've been discussing can become reality. Medical science in general seems to have been progressing more slowly than other realms of science and technology.

Quote (Talisman)
Also, Harb, how far away do you think we are from the singularity?

Far enough. Probably won't happen within our lifetimes. But anything's possible.

Quote (Talisman)
This is the scary part... How can we know that your consciousness is actually the same one and was switched over?

If it's a perfect duplicate, then as I explained they're both "you". But you take on two separate identities after the moment of duplication. Each copy is a distinct individual, with its own identity. But they'll both identically remember everything about their lives before the duplication; their identities up to that point were the same, because they both had the same existence. It's like the example of William and Thomas Riker in Star Trek (if you haven't seen it, watch the season 6 episode of Star Trek: The Next Generation called "Second Chances").

It can be a real mind bender; try not to think about it too hard.





All forum users, please read this!
My SE mods and addons
Phenom II X6 1090T 3.2 GHz, 16 GB DDR3 RAM, GTX 970 3584 MB VRAM


Edited by HarbingerDawn - Tuesday, 24.07.2012, 12:39
 
DoctorOfSpaceDate: Tuesday, 24.07.2012, 15:37 | Message # 52
Galaxy Architect
Group: Global Moderators
Pirate
Messages: 3600
Status: Offline
Quote (HarbingerDawn)
Far enough. Probably won't happen within our lifetimes. But anything's possible.


I consider the 2040s to be within our lifetimes. If you look at technology now, it's already happening. I said in another thread that people alive today could still be alive 1000 years or more from now, although not so human anymore.

Quote (HarbingerDawn)
If it's a perfect duplicate, then as I explained they're both "you". But you take on two separate identities after the moment of duplication. Each copy is a distinct individual, with its own identity. But they'll both identically remember everything about their lives before the duplication; their identities up to that point were the same, because they both had the same existence. It's like the example of William and Thomas Riker in Star Trek (if you haven't seen it, watch the season 6 episode of Star Trek: The Next Generation called "Second Chances").


Yes, exactly, but the fact remains you still have to die. A copy of you is fine and dandy, but the original pattern is destroyed. That's why I have no plans to upload my mind if it's ever possible. I might make memory backups though; that would be pretty cool.

If no one has seen it: the Singularity, ruined by lawyers
http://www.youtube.com/watch?v=IFe9wiDfb0E





Intel Core i7-5820K 4.2GHz 6-Core Processor
G.Skill Ripjaws V Series 32GB (4 x 8GB) DDR4-2400 Memory
EVGA GTX 980 Ti SC 6GB
 
HarbingerDawnDate: Tuesday, 24.07.2012, 16:35 | Message # 53
Cosmic Curator
Group: Administrators
United States
Messages: 8717
Status: Offline
Quote (DoctorOfSpace)
I consider the 2040s to be within our lifetimes. If you look at technology now, it's already happening.

Possible. But I don't think it's likely. Even if we do develop the technological capability to achieve the singularity, there are other factors (social, cultural, political, religious) that will stall its onset. And as apenpaap stated, Moore's law will run out before then. Does the 2040s estimate for the singularity take that into account? (Again, I don't think it's impossible, I just felt like asking.)

Quote (DoctorOfSpace)
Yes, exactly, but the fact remains you still have to die. A copy of you is fine and dandy, but the original pattern is destroyed.

I don't think that you'd necessarily "have" to die, but it would certainly be the default choice. I'd like to think that other arrangements would be possible, but then I live in my own ideal world where everything doesn't suck so much...

In any case, if a perfect duplicate of you is created elsewhere as in the example above, and the original is destroyed, then there is no practical difference between that and you being non-destructively transported there instantaneously, aside from the rejected possibility of there being multiple copies of you. The only discrepancy is philosophical.

Quote (DoctorOfSpace)
I said in another thread that people alive today could still be alive 1000 years or more from now, although not so human anymore.

I hope you're right, but I'd bet that you're wrong. I don't see medical science or neurology reaching that level within the next few decades. I wish it would, it would solve a lot of problems for me, but realistically I just can't see it. But again, I hope you're right.

Quote (DoctorOfSpace)
If no one has seen it, the Singularity ruined by lawyers

ahahahaha so win! biggrin





All forum users, please read this!
My SE mods and addons
Phenom II X6 1090T 3.2 GHz, 16 GB DDR3 RAM, GTX 970 3584 MB VRAM
 
DoctorOfSpaceDate: Tuesday, 24.07.2012, 17:19 | Message # 54
Galaxy Architect
Group: Global Moderators
Pirate
Messages: 3600
Status: Offline
Quote (HarbingerDawn)
Does the 2040s estimate for the singularity take that into account?

Silicon is supposed to hit its limit by, I think, ~2020.

The singularity movement does take that into account, though. There have always been barriers in technology. Vacuum tubes hit a limit but computers still advanced; flat silicon processors will hit their limit, but even after that computers will still improve. Once you get designs down to as small as they can get, you can then use that engineering knowledge to fold them in on themselves and design faster and more efficient machines. Think of creating artificial cells, which would be 3-dimensional, rather than flat 2D chips. You gain another dimension and more processing power. Combine this with knowledge of biology, biological engineering, and nanotechnology, and everything changes.
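To put a toy number on the "extra dimension" point: if the same feature density could be stacked vertically instead of laid out flat, the element count would go from N^2 to N^3. The N here is completely arbitrary and purely illustrative.

```python
n = 10_000                 # elements along one edge of a hypothetical chip (arbitrary)

flat_chip = n ** 2         # conventional 2D layout
stacked_chip = n ** 3      # fully 3D layout at the same feature density

print(f"2D: {flat_chip:.2e} elements, 3D: {stacked_chip:.2e} elements")
print(f"gain from the extra dimension: {stacked_chip // flat_chip:,}x")   # = n
```

Heat and manufacturing would eat into that long before the full factor of N was reached, but it shows why an extra dimension is such a big deal.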

Quote (HarbingerDawn)
I don't think that you'd necessarily "have" to die, but it would certainly be the default choice. I'd like to think that other arrangements would be possible, but then I live in my own ideal world where everything doesn't suck so much...

In any case, if a perfect duplicate of you is created elsewhere as in the example above, and the original is destroyed, then there is no practical difference between that and you being non-destructively transported there instantaneously, aside from the rejected possibility of there being multiple copies of you. The only discrepancy is philosophical.


True, nature is indifferent to such things, but I am talking from the point of view of the self. Your continued existence would happen for everyone else, but from your perspective you would cease to exist. But yes, this brings up the whole philosophical discussion of who is the entity we consider to be "I".

Quote (HarbingerDawn)
I hope you're right, but I'd bet that you're wrong. I don't see medical science or neurology reaching that level within the next few decades. I wish it would, it would solve a lot of problems for me, but realistically I just can't see it. But again, I hope you're right.


I'm willing to bet it will happen within our lifetimes. Biotechnology will converge with nanotechnology and lead to the curing of diseases; that's already happening now.

Take this for example in recent news
http://gizmodo.com/5927302....titis-c

These things will become more and more common as technology gets faster, smaller, cheaper, and overall better.

Not to mention that knowledge in neuroscience/biomimicry is increasing, so prostheses will get better and cheaper. Take DARPA's prosthetic arm as an example of this:
http://www.youtube.com/watch?v=bDvJEKfT5TI

They are also building more and more complex robots.
http://www.youtube.com/user/BostonDynamics

It is only a matter of time before these technologies converge to a single point. There is nothing intrinsically special about nature and life; at its most fundamental level, life is just code. Life, in its own right, is a form of natural technology.

I recommend looking into some of Ray Kurzweil's talks and discussions.
http://www.youtube.com/watch?v=6XY38r9x5k4





Intel Core i7-5820K 4.2GHz 6-Core Processor
G.Skill Ripjaws V Series 32GB (4 x 8GB) DDR4-2400 Memory
EVGA GTX 980 Ti SC 6GB


Edited by DoctorOfSpace - Tuesday, 24.07.2012, 21:07
 
SolarisDate: Tuesday, 24.07.2012, 21:10 | Message # 55
World Builder
Group: Global Moderators
France
Messages: 731
Status: Offline
Very interesting discussion (even if it's hard to understand some parts, it's good for my English!)

My point of view is that the technological singularity will be closely linked to the fate of civilizations.
We live in a world where man has never known how to live in harmony with himself, and given the historic demographic leap of the last centuries and its consequences, the question arises as to HOW man will use these new technologies.

Already today, only relatively few people find it easy to access health care or the other comforts that technology brings. What will it be like when, as DoctorOfSpace said, nanorobots or the like emerge? It seems to me that it could deeply divide humans.
The financial and human investment in research is based on the motivations of men, and I'm not sure that the management of the world will then be handled by a "supercomputer" capable of coordinating needs and available resources to meet the common interests of mankind, or that every human will be equipped with nanobots allowing them not to eat or breathe, or to live eternally. ( wacko ..)
I think that, above all, our species has a long way to go before we reach this stage of maturity, and I fear that until then it will only dramatically increase the "size" of inequalities, which are already outrageous.

And again, this raises the problem of morality:
Can we accept any technological change, even one affecting the definition of humanity?
Can we accept technological change that carries too great a risk of losing control of human society? (wiki)

The interesting thing is that, for the first time, we are conscious of the different possible paths of evolution, and the choice seems critically important.

About the technological leap itself, 2040 seems too soon to me too. ~2200 would seem more realistic to me if I had to guess.

I've just re-read this post and it looks a bit angry about technology or mankind's evolution. Just to be clear, I have nothing against it, and I know that someday humans as we know them won't exist anymore.
It's just that history has shown us that technology is capable of the best as well as the worst (or is it man?), but given the prediction of the technological singularity, the current state of the world, and the path/way/system that humans have chosen... Maybe I'm too pessimistic.

Singularity technology could lead to a bigger systemic/ideological crisis, which could lead to the birth of a computer capable of loving. smile


Edited by Solaris - Tuesday, 24.07.2012, 21:11
 
DoctorOfSpaceDate: Tuesday, 24.07.2012, 21:20 | Message # 56
Galaxy Architect
Group: Global Moderators
Pirate
Messages: 3600
Status: Offline
Quote (Solaris)
Can we accept technological change that carries too great a risk of losing control of human society? (wiki)


We have no choice. Technology will always get better and better; it can't be stopped, because it is demanded. You always have the sense that you are on the edge of something big: just one more little push and you'll have something new and better.

Quote (Solaris)
About the technological leap itself, 2040 seems too soon to me too. ~2200 would seem more realistic to me if I had to guess.


It is not a leap. Technology is a self-feeding process that keeps building on previous improvements. With faster computers you can design better and faster computers more quickly than with older tech. As you keep designing faster and more efficient machines, you can keep designing and producing them EVEN FASTER.

This exponential growth is currently projected to hit the mark of singularity by the 2040s; it might be later or it might be sooner depending on what happens. The earliest projected dates are in the late 2030s.
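Here's a minimal cartoon of that self-feeding loop, just to show how the shape of the curve falls out of the assumptions. The doubling per generation, the 16 design-years for the first generation, and the 2012 starting point are all invented numbers, which is exactly why projected dates swing between the late 2030s and much later.

```python
capability = 1.0            # arbitrary units of design power
year = 2012.0
first_gen_effort = 16.0     # assumed design-years needed by generation one

while capability < 1e6:
    year += first_gen_effort / capability   # faster machines finish the next design sooner
    capability *= 2                         # assume each generation doubles capability
    print(f"{year:8.2f}: capability x{capability:,.0f}")

# With these made-up numbers the loop "runs away" in the mid-2040s;
# change either assumption and the date moves by decades.
```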

Quote (Solaris)
Singularity technology could lead to a bigger systemic/ideological crisis


I think that once intelligent machines take over, a true utopian society would be possible.





Intel Core i7-5820K 4.2GHz 6-Core Processor
G.Skill Ripjaws V Series 32GB (4 x 8GB) DDR4-2400 Memory
EVGA GTX 980 Ti SC 6GB


Edited by DoctorOfSpace - Tuesday, 24.07.2012, 21:22
 
curiousepicDate: Tuesday, 06.11.2012, 08:27 | Message # 57
Space Pilot
Group: SE team
United States
Messages: 141
Status: Offline

For those interested in human extinction, a new paper by Nick Bostrom on Existential Risks: http://www.existential-risk.org/concept.pdf





My ideal preferences for visual design of the mothership and technology in SE
Harry Potter and the Methods of Rationality


Edited by curiousepic - Monday, 19.12.2011, 20:04
 