Does it matter? If LLMs aren't that, whatever 'that' is, then we should use a different word. Finders keepers.



How do you know that LLMs “aren’t that” if you can’t even define what that is?

“I’ll know it when I see it” isn’t a compelling argument.


I think a successful high-level intelligence should quickly accelerate, converging toward infinity or physical resource exhaustion, because it can now work on improving itself.

So if above-human intelligence does happen, I'd assume we'd know it quite soon.


> “I’ll know it when I see it” isn’t a compelling argument.

It feels compelling to me.


They can't do what we do; therefore they aren't what we are.


And what is that, in concrete terms? Many humans can’t do what other humans can do. What is the common subset that counts as human intelligence?


Process vision and sound in parallel for 80+ years; rapidly adapt to changing environments and scenarios; correlate seemingly irrelevant details from a week ago or years ago; selectively ignore instructions and know when to disagree.



