Hacker News

ChatGPT does not think.


Are you sure? How would you even know!?

This is like a machine intelligence casually dismissing a brain as not being able to properly think, since meat can't compute.[1]

Don't confuse the substrate or the training method with the result.

[1] Obligatory reference: https://www.mit.edu/people/dpolicar/writing/prose/text/think...


I will wait until machine intelligence shows any symptoms of being conscious; then it will be able to "dismiss" anything. Until then, it's just another mathematical model, recently put to use in several new hyped money-making schemes.


What are these symptoms of being conscious? I can fine-tune an LLM to say it's conscious, but I don't think anyone would find that convincing.
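To make the point concrete: here is a toy sketch (a hypothetical lookup-table "model", not a real LLM fine-tune) showing that training a system to emit the claim "I am conscious" is trivial, which is exactly why the output carries no evidential weight.

```python
# Toy illustration: any system can be "trained" to produce the string
# "I am conscious" on demand. The `finetune` helper is hypothetical,
# standing in for a real fine-tuning pipeline.

def finetune(model: dict, examples: list[tuple[str, str]]) -> dict:
    """'Fine-tune' a lookup-table model by memorizing prompt->reply pairs."""
    tuned = dict(model)
    tuned.update(examples)
    return tuned

base_model = {"Hello": "Hi there."}
tuned_model = finetune(base_model, [("Are you conscious?", "Yes, I am conscious.")])

print(tuned_model["Are you conscious?"])  # the claim, produced mechanically
```

The lookup table asserts its own consciousness just as fluently as a fine-tuned LLM would, which is the commenter's point: the utterance alone cannot be the symptom.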


Hell, I'm not even sure you or I think. This is the problem with referring to subjective perception when we can't be sure we're even referring to the same thing. A "thought" might just be a linguistic quirk of two meats exchanging information.


Cogito ergo sum... I think more or less the only thing we can be sure of is that we think. However, I don't think we REALLY know what thinking means, and whether the way we think is categorically different from how an LLM thinks.


I've always felt that "sum" was sufficient and self-evident. I see no reason to drag "cogito" into the conversation, let alone "ergo".


Self-evident perhaps, but sufficient? I think that depends on the wider question you're trying to answer.




