Hacker News

LLMs may be on the way to AGI, but they are still laughably bad at logical reasoning.

Any remotely interesting coding task is at least somewhat novel and requires some reasoning, and so far LLMs don't seem to handle things they haven't seen before very well.



