Hacker News

What is the hallucination rate of, for example, Llama 3 or GPT-4?



They claim a 50% hallucination rate with fine-tuning and/or RAG (the marketing phrasing is unclear, imo), and that their method achieves 5% on the same task, which is apparently a text-to-SQL task set.
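For a text-to-SQL set there's a fairly natural way to operationalize "hallucination rate": execute each generated query against the database and count the fraction whose results differ from the gold query's (or that fail to run at all). A minimal sketch, assuming that definition and a toy SQLite schema (the `users` table and the sample query pairs below are made up for illustration):

```python
import sqlite3

def hallucination_rate(pairs, setup_sql=None):
    """Fraction of (predicted, gold) SQL pairs whose execution
    results differ -- one rough proxy for 'hallucination'."""
    conn = sqlite3.connect(":memory:")
    if setup_sql:
        conn.executescript(setup_sql)
    mismatches = 0
    for predicted, gold in pairs:
        try:
            pred_rows = conn.execute(predicted).fetchall()
        except sqlite3.Error:
            mismatches += 1  # query doesn't even run: count as hallucinated
            continue
        gold_rows = conn.execute(gold).fetchall()
        # Compare as unordered row sets, since column order matters
        # but row order usually doesn't without an ORDER BY.
        if sorted(pred_rows) != sorted(gold_rows):
            mismatches += 1
    conn.close()
    return mismatches / len(pairs) if pairs else 0.0

# Toy demo: two correct queries, one wrong result, one invalid query.
setup = """
CREATE TABLE users (id INTEGER, name TEXT);
INSERT INTO users VALUES (1, 'a'), (2, 'b');
"""
pairs = [
    ("SELECT name FROM users", "SELECT name FROM users"),
    ("SELECT id FROM users WHERE id = 1", "SELECT id FROM users WHERE id = 1"),
    ("SELECT id FROM users WHERE id = 9", "SELECT id FROM users"),
    ("SELECT nope FROM users", "SELECT id FROM users"),
]
print(hallucination_rate(pairs, setup_sql=setup))  # -> 0.5
```

Execution-match metrics like this are coarse (two different queries can coincidentally return the same rows on a given database), so reported numbers depend heavily on how the vendor defines a hallucination.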



