
How do I know the LLM isn't lying to me? AIs lie all the time, it's impossible for me to trust them. I'd rather just go to the actual source and decide whether to trust it. Odds are pretty good that a programming language's homepage is not lying to me about the language; and I have my trust level for various news sites already calibrated. AIs are garbage-in garbage-out, and a whole boatload of garbage goes into them.





They could provide verbatim snippets surrounded by explanations of relevance.

Instead of the core of the answer coming from the LLM, it could piece together a few relevant contexts and just provide the glue.


They do this already, but the problem is it takes me more time to verify whether what they're saying is correct than to just use a search engine. All the LLMs constantly make stuff up and have extremely low precision and recall of information.

I don't understand how that's an improvement over a link to a project homepage or a news article. I also don't trust the "verbatim snippet" to actually be verbatim. These things lie a lot.

> How do I know the LLM isn't lying to me?

How do you know the media isn't lying to you? It's happened many times before (think pre-war propaganda).


We’re talking about the official website for a programming language, which has no reason to lie.

>Odds are pretty good that a programming language's homepage is not lying to me about the language

Odds are pretty good that, at least for less popular projects, the homepages themselves will soon be produced by some LLM, and left at that, warts and all...





