How do I know the LLM isn't lying to me? AIs lie all the time; it's impossible for me to trust them. I'd rather just go to the actual source and decide whether to trust it. Odds are pretty good that a programming language's homepage is not lying to me about the language, and my trust levels for various news sites are already calibrated. AIs are garbage-in, garbage-out, and a whole boatload of garbage goes into them.
They do this already, but the problem is it takes me more time to verify whether what they're saying is correct than to just use a search engine. All the LLMs constantly make stuff up & have extremely low precision & recall of information.
I don't understand how that's an improvement over a link to a project homepage or a news article. I also don't trust the "verbatim snippet" to actually be verbatim. These things lie a lot.
>Odds are pretty good that a programming language's homepage is not lying to me about the language
Odds are pretty good that, at least for less popular projects, the homepages themselves will soon be produced by some LLM and left at that, warts and all...