> Basically if you've got a prion disease it's just a matter of time before they end you.
Unless you had a mind backup and a second body ready.
Which, I think, is what all medicine ultimately aims for, or should aim for. Why bother learning how to repair the same single body over and over (which won't help in unexpected catastrophic failures, a.k.a. fatal accidents, anyway) when you can just grow a new one and jump into it?
While we have thousands of years of cultural belief that mind and body are distinct, I'm not aware of any scientific evidence for that. And there is a lot of evidence that the mind is directly dependent on the state of the body--change the body (drugs, injury, etc) and inevitably the state of mind changes too.
So I don't know of any reason to think that it's even possible to separate a mind from a body, let alone move it into a new one.
There's no reason to roll over and just accept death from aging, accident or violence, any more than we accept death by flu or any other curable disease. Things which would've outright killed people 200 years ago now, so to speak, merely give us a few days of a runny nose. You cannot deny that medicine is, after all, basically the quest for immortality, and that is where it's going to end up.
You cannot copy "all" of the information stored in the human body. Briefly, this is because the physics of information storage sets a floor on the cost: irreversibly writing or erasing a bit dissipates at least kT ln 2 of heat (Landauer's principle), or about 2.9 zJ at room temperature.

The brain's retrievable memory capacity is about 2.5 * 10^15 bytes (https://www.scientificamerican.com/article/what-is-the-memor...), but the amount of information which can be retrieved is much smaller than the amount of information required to fully describe the system -- in the same way that knowing the contents of my hard disk will not help you recreate my laptop.

The brain weighs about 1300 grams, of which maybe 5%, or 65 grams, is protein, composed of amino acids which weigh about 0.25 zeptograms each, so it contains about 2.6 * 10^23 amino acids. If we assume that we need a kilobit of information per amino acid, which seems like a lowball, then merely storing the information describing the human brain would require about 750 kilojoules at the absolute thermodynamic minimum (vastly more efficient than real computers). The electrical power required to simulate its dynamics would be tectonic, and that's assuming perfect efficiency.
Therefore, no computer that is composed of components which interact with the electromagnetic field and which operates at a temperature comparable to Earth temperatures will ever simulate a human mind to complete accuracy. It will always be an approximation.
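For concreteness, here's that back-of-envelope arithmetic as a runnable Python sketch. The 5% protein fraction, 0.25 zg per amino acid, and kilobit-per-amino-acid figures are the same assumptions as in the argument above, not established facts:

```python
import math

k_B = 1.380649e-23                        # Boltzmann constant, J/K
T = 300.0                                 # room temperature, K

landauer_per_bit = k_B * T * math.log(2)  # Landauer limit, J per bit
print(f"per bit: {landauer_per_bit:.2e} J")   # ~2.9e-21 J (2.9 zJ)

protein_g = 1300 * 0.05                   # ~5% of a 1300 g brain is protein
amino_acid_g = 0.25e-21                   # ~0.25 zeptograms per amino acid
n_amino = protein_g / amino_acid_g        # ~2.6e23 amino acids

bits = n_amino * 1000                     # assume a kilobit per amino acid
print(f"minimum energy: {bits * landauer_per_bit / 1e3:.0f} kJ")  # ~750 kJ
```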
Copying a disk does in no way require simulating each atom of the disk, though?
• We can already clone mammalian bodies.
• I'll assume that "growing" an adult human body, with a "blank" or "default" brain, isn't too inconceivable.
• Your real, natural brain can access the information within itself.
• At the moment of "copying", we figure out some way to biologically link the two brains together, and write some biological software to pull data from the source brain to the destination brain.
The brain is, as far as we can tell, more like that circuit than a disk. Information is encoded within the structure of the processor itself, rather than being one of 2^N easily defined conformations of a memory device. As such, there is no way to copy the memories of brain A into brain B -- it simply isn't a thing. It would be like saying "copy all of the stains on this shirt onto that one" when the stains are integrated into the fibers and unique to the stitching. In particular, this statement is false:
>Your real, natural brain can access the information within itself.
It cannot. For example, I cannot fully describe how to walk, given a pair of human legs and a butt. My brain can only encode information into language in the particular ways in which it is "designed" to do so. There is no way at all for the brain to report, e.g., the number of synapses within it, even if the firings of some neurons are recorded electronically.
I say "design" in this sense because the structure of my brain is contingent on its fitness-for-purpose in a way that affected its formation, via biological evolution, similar to the way that the structure of an intentionally designed object is contingent on its fitness-for-purpose, which affected its formation as it was constructed by the entity which made it.
Well then. If copying's out of the question (for now), then I propose Plan B:
A single brain, in a secure location, remotely controlling multiple bodies (one at a time, but able to switch between them.)
• Identify all the points of contact – the I/O pins – between the brain and body.
• Identify the medium and format of communication between brain/body/senses. I think most of this has already been achieved.
• Grow a body, a shell, without a brain.
• Take your brain out and keep it alive in a vat.
• Put wireless transceivers at each I/O point on the controlling brain and the shell body/bodies.
Will anything like this be remotely (pun intended) possible?
You'd have to get used to lag, but in the case of fatal accidents/violence and most diseases, your brain would be safe and able to switch to a new body.
I expect something like that would be very valuable to soldiers and workers in hazardous environments, like space.
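Purely for illustration, the switching logic might look something like this toy model (all names here are hypothetical; the hard parts are the biology in the bullet points above, not the software):

```python
class ShellBody:
    """A brainless grown body with transceivers at each I/O point."""
    def __init__(self, name):
        self.name = name
        self.alive = True

class BrainUplink:
    """One brain in a vat, driving exactly one shell body at a time."""
    def __init__(self, bodies):
        self.bodies = {b.name: b for b in bodies}
        self.active = None

    def switch_to(self, name):
        body = self.bodies[name]
        if not body.alive:
            raise RuntimeError(f"{name} was destroyed; pick another shell")
        self.active = body

    def motor_command(self, command):
        # Every command pays the round-trip lag to the secure vault.
        return f"{self.active.name} executes: {command}"

uplink = BrainUplink([ShellBody("shell-A"), ShellBody("shell-B")])
uplink.switch_to("shell-A")
print(uplink.motor_command("wave"))
uplink.bodies["shell-A"].alive = False  # fatal accident: only the shell is lost
uplink.switch_to("shell-B")             # the brain simply switches bodies
print(uplink.motor_command("carry on"))
```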
While the information on the disk is stored in the "analog" world (the magnetization of regions of the platters), there's an infinite number of analog states that lead to the same digital disk contents.
You can't take the digital representation and reconstruct the analog state that led to it, since the analog-to-digital conversion lost information.
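A toy version of that many-to-one collapse, using a 1-bit quantizer:

```python
def adc_1bit(voltage, threshold=0.5):
    """Quantize an analog voltage to a single bit."""
    return 1 if voltage >= threshold else 0

# Three different analog states, one digital reading:
for v in (0.51, 0.73, 0.99):
    print(v, "->", adc_1bit(v))   # all print 1
# Given only the bit, the original voltage is unrecoverable.
```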
And growing a body with a "blank" brain, what does that mean? Your consciousness is absolutely intertwined with your body. If you take my mind and put it in Usain Bolt's body, that Frankenthing won't run 100m in <10s.
Okay, come on, Razengan. This thought experiment is not a difficult one.
I invent a machine that "scans" your body in such a way that I'm able to create a perfect replica of the state your body was in, nearly to the atom. Molecule by molecule, the machine can rebuild your body, within a near-infinitesimal margin of error.
I put you inside this machine, scan you into a replica file, and save it to a massive storage device; when called upon, I can rebuild that version of you whenever I want.
Now, after being successfully scanned, you exit the device. I blow your head off with a shotgun, and your headless corpse slumps on the floor. No witnesses.
Do you really think you'll come back to life, if I print out your replica from the file?
It is good to note that Landauer's principle is a statistical argument about the degrees of freedom in a system. The bound is a statistical average of this effect, meaning you will sometimes be able to store a little more information.
I think your calculation of the energy required to store a human brain is off. Assume 8.6 x 10^10 neurons (a more accurate number, as it's a simple density measurement), each holding a 64-bit weight, which gives us 5.5 x 10^12 bits. Multiplied by the lower bound, k ln 2 at T = 1 K, or 9.56 x 10^-24 J per bit, this gives us 5.3 x 10^-11 J, which is less than the energy required to type a letter on this keyboard. Even if we scale this up to body temperature, 37 C or 310.15 K, we get only about 1.7 x 10^-8 J. This is of course a lower bound on the problem, but even if you pump up the number of neurons or the weight size, you still get less than a joule of energy required.
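Redone as a quick script, under the same assumptions (8.6 x 10^10 neurons, one 64-bit weight each):

```python
import math

k_B = 1.380649e-23            # Boltzmann constant, J/K
bits = 8.6e10 * 64            # ~5.5e12 bits

for T in (1.0, 310.15):       # 1 K lower bound, then body temperature
    print(f"T = {T} K: {bits * k_B * T * math.log(2):.1e} J")
# T = 1 K:      ~5.3e-11 J
# T = 310.15 K: ~1.6e-8 J
```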
There is no way in hell you're going to store the entire physical state of a neuron in anything near 64 bits. The question is not "information stored in the brain". The question is "information required to recreate the brain". In fact a single neuron has many more than 64 synapses, each of which itself has a highly intricate structure. All of the information describing this structure must be retained intact in order for the image brain to be equivalent to the substrate.
Let's do a thought experiment, and say that it is possible to 'copy' a person.
We live in the future where we all use fancy teleporters. You happily live in Amsterdam and commute to the Bay Area every morning. The way the teleporter actually works is by making a quick copy, deconstructing the object, and then recreating the exact copy someplace else.
One day you go to the teleporter and realize it's broken. You contact customer service, now totally missing that 9 o'clock meeting. Customer support says: "We're sorry, but there has been a malfunction. You were teleported but not deconstructed properly. Could you please step back into the teleporter so we can deconstruct you?"
Body and self are intertwined in ways we don't yet fully understand, so it will be very difficult to transfer consciousness. In the case of copying, you are for sure just dying, and a new copy of you will be born.
A scarier thought even: perhaps this is what happens when we go to sleep -- our consciousness dies and a new one wakes up every morning. Sure would explain why I always have a hard time falling asleep ;)
Do you consider the "you" of today to be the same as the "you" of a year ago? Or is the current "you" just some bloke who has the same memories of the old "you"? Considering how much material in the body gets replaced over time.
OK, that's a bit off-topic, but an interesting question regardless, and I've been thinking about this lately. I finally settled on something some time ago, though my opinion on the matter could still change.
I think that we are in perpetual evolution. Our memories are perpetually updated, and we make our choices based on our experience (and probably some biochemical mechanisms that affect the decision-making process, perhaps by changing how those memories are accessed, among other things).
Aren't we dying every day, for someone else to wake up in our place the next day? To put it more bluntly, our consciousness is interrupted for the night, a "garbage collector" runs, and we "reboot" the next day. What about that night you can't remember because of alcohol? Someone was there, controlling your body, yet it wasn't you (and to repeat myself, the one writing this - me - is probably going to be slightly different from the one reading the possible replies - "me" as well - and that's OK).
What I find valuable is experience. Period. It takes a huge amount of energy to train a child, and the goal is for him to be beneficial to society (think tribe/species if you want) in return. He will be able to pass on part of his experience to the next generation, thus improving the pool of knowledge for the society (this is a general rule; it can work with genes as well).
Thus, jumping a bit to conclusions: I value my experience and memories more than this body, and I would gladly part with it if that could mean a longer life span for them. Even existing only as an "archive". But I consider that there is still much data/experience collection I can do before I'm gone. And there is no way to pass this data on.
If you think about it this way, a copy is a fork, and is likely to preserve experience: that's good. And cooperation is a given if the two share this idea.
Moreover, I consider it a rather optimistic and cheerful way of viewing things; it can be comforting at times. And I feel like some robot apocalypse or similar is simply a non-issue, as long as knowledge is preserved.
That's the thought process you can have if you think too much about issues like teleportation.
Now, I need to get back to work on my brain scanner.
Seriously though, what is your opinion on the matter, now that I've laid out my current one? I would be curious if you could "share your experience" with me :P
> Aren't we dying every day, for someone else to wake up in our place the next day? To put it more bluntly, our consciousness is interrupted for the night, a "garbage collector" runs, and we "reboot" the next day.
That... is eerily similar to something I heard or read from a character in a game/anime/book/show very recently, but the name escapes me...
Maybe it was from Tides of Numenera.. You should definitely check out that game as well as SOMA. There's also a charming-yet-mindfuck'y anime called Kaiba. They go pretty deep into the idea of consciousness, identity and free will.
By the way, is there a name or term for that, uh... idea? That our self "resets" in sleep?
This thought has crossed my mind many times! Great to see it here. I don't know if there is an experiment that could prove or disprove this theory. How does one differentiate between two 'different' selves? And should there be a difference?
A short story in this vein that I really like is "Fat Farm" by Orson Scott Card. The copied dude looks wistfully after his copy who just walked out the door, and asks "Now what?" The answer is not pleasant.