
Before one can begin to understand something, one must first be able to estimate the level of certainty. Our robot friends, while really helpful and polite, seem to be lacking in that department. They treat the things we've written on the internet, in books, academic papers, court documents, newspapers, etc. as if they were actually true. And where humans aren't omniscient, they fill in the blanks with nonsense.




> And where humans aren't omniscient, they fill in the blanks with nonsense.

As do most humans. People lie. People make things up to look smart. People fervently believe things that are easily disproved. Some people are willfully ignorant, anti-science, anti-education, etc.

The problem isn't the transformer architecture... it is the humans who advertise capabilities that are not there yet.


> As do most humans. People lie. People make things up to look smart.

The main difference is that people know when they are lying or making things up to look smart; an LLM doesn't.



