
Your mind runs on biology, which only pauses for sleep, death, and theoretically cryonics.

Practically, cryonics today is just fancy death in a steel coffin filled with liquid nitrogen; but if it weren't death, if it were actually reversible, then your literally frozen brain is conceptually not that different from an AI* whose state is written to a hard drive and not updated by continuous exposure to more input tokens.

* in the general sense, at least — most AIs as they currently exist don't get their weights updated live, and even for those that do, the connectome doesn't behave quite like ours

Yeah, if an LLM is alive/conscious, it could only be so for the duration of inference, so using chatgpt would be some kind of horrific mass murder, not a conversation with another sapient entity.

> so using chatgpt would be some kind of horrific mass murder

We don't yet have useful words for what it is.

If the contexts are never deleted, it's like being able to clone a cryonics patient, ask the clone a question, then put it back on ice.

Even if the contexts are deleted but the original weights remain, is that murder, or amnesia?
