Hacker News

I just wish we’d stop using words like “intelligence” or “reasoning” when talking about LLMs, since they do neither. Reasoning requires being able to reconsider every step of the way and continuously take in new information; an LLM is dead set in its tracks. It might branch or loop around a bit, but it’s still the same track. As for intelligence, well, there’s clearly none, even if the magic trick might fool you at first.



