We Can’t Upload You, Sorry – Why we can’t put your mind in the machine (2021) (onezero.medium.com)
10 points by d_tr on Jan 29, 2023 | 6 comments



The article just says what we already know: we can't upload people right now. This is obvious: there are no available systems for mind uploading, commercial or otherwise.

But why would that always be the case? And why should we assume that the reasons the author states are even true right now, much less whether they will actually impede the development of uploading? As we have plainly seen with artificial neural networks over the course of the past decade, major progress in technology can happen overnight, and that overnight progress can lead to completely unexpected new directions in research, consumer products and social consequences.

I also just don't buy the arguments to begin with. No, our brains are not static connectomes – has anyone ever seriously argued this? Obviously, our brains and our minds are dynamic. Any system we build to represent a human mind would necessarily require the capacity for dynamism and change.

Another argument from the article: we can't measure synapse strength without a barrage of tests on each individual synapse. Is this even true? And why should we believe it will remain impossible? Machine learning techniques seem like a great match for this problem. We need far more data and much better instruments, but those will come.

And, of course, everyone understands that you can't make a completely detailed map of anything without some form of simplification or abstraction. The first uploaded minds and the first simulations would surely be crappier than the ones that follow. This has been covered in a dozen sci-fi works: Stephenson's Fall (2019) is the most recent big name example I can think of.

We're never going to know until we try. And people are definitely going to try this... a lot. Constantly. Until the end of our civilization. So no, we can't upload anyone right now. But we should probably assume that it will happen, with varying levels of quality, before many of us die. Compared to other fantastic scenarios, like contact with aliens, human brain simulation seems like a sure bet.


Yeah, I believe that it will be possible at some point, possibly by using a combination of several techniques to extract as much information as possible.

We change as we age anyway, so some loss should be perfectly acceptable.

It is just more complex than some people might be imagining. Searching for "a connectome is not enough" on Google brings up a few interesting articles and papers including studies on C. elegans and the fruit fly.

If the goal is continued existence, there are other possible ways to achieve it. I'd say "scanning" an already dead brain will probably end up not being the most desirable one.


I think it will eventually be possible, but I don't necessarily agree that the subjective nature of life will transfer. Let's presume it's possible to physically clone a grown human. Now you have two subjective experiences occurring simultaneously, yet the "original" will only actually experience their own life from that point, the clone would be a totally independent copy, correct? So why do we think that if we could upload our mind into a machine, that our subjective experience would transfer? The mind in the machine would have all of the original life experience, etc. Anything that happened to the mind in the machine after uploading would be just as unknown to the original person, just as any other person's internal experience is.

A similar argument can be applied to Star Trek-style teleportation. The idea is that we break an organism or item into its constituent particles, transmit those in some fashion to the target site, and then reconstruct the original object from the particles. So in essence, you're destroying and recreating objects and organisms every time they teleport. I feel like the "original" consciousness would be separated from the biological vessel, and I don't necessarily think that the same consciousness would be present on the destination side. Sure, a consciousness will be present, and it will have access to all the memories and experiences of the original person, so it will feel like it is the same one. However, I don't feel like it actually would be.

I have no scientific evidence or argument for all of this. Quite honestly, it's a "gut feeling" of how consciousness itself works, that was "given" to me during a particularly time-bending DMT experience, so feel free to take it with a grain of salt. Then again, I don't think that science will ever be able to provide all the answers to human existence, but it's not really designed to. It's one of the few processes we as humans have attempted to design to be as utterly resistant to bias and subjectivity as possible. It's simply that not all things can be answered through objectivity.


Most mind-uploading futurism seems to be driven by the idea that we'll achieve digital immortality--at some point before or just after death, we'll transfer our minds to a simulated being in a simulated world, or perhaps a robot in the real world. And in that light I completely agree with what you're saying--digi-me would not be the same consciousness as bio-me. There's a trivial proof of this--suppose bio-me doesn't conveniently die, and my consciousness is merely cloned. There are now two inarguably distinct beings who just happen to share the same mental history until the "fork".

But we can take this thought experiment even further--how do I know that the me who woke up this morning is the same one who went to sleep last night? Perhaps my consciousness "dies" every night and a new "instance" inhabits my body when I wake up. My sense of personal continuity comes from my memories of always being me, not from uninterrupted consciousness. To take it even further, perhaps this happens from one moment to the next while I'm awake. How would I know?

In light of that thought experiment, perhaps there's no such thing as a continuous me. That would mean that digi-me is just as legitimately "me" as bio-me, even if we exist at the same time and argue about which of us is the true me.

This is not my personal belief--I share your gut feeling that there's an immutable "real me". But I have to acknowledge that perhaps this is only because my brain is arranged so that it's impossible for me to feel otherwise.


Sounds more like a list of to-do bullet points than a flat "not possible."

It is also likely that human consciousness is more robust than the author thinks. You may still be yourself even if you lose some metadata in the transition, much like how you are still you during a hangover, when the working of your neurons is significantly impaired.




