Hacker News

LLMs can't develop concepts in the way we think of them (i.e., you can't feed LLMs the scientific corpus and ask them to independently tell you which papers are good or bad and for what reasons, and to build on those papers to develop novel ideas). True AGI—like any decent grad student—could do this.


