Hacker News

This is actually a great idea for improving the reliability of answers from a language model. I have also had trouble with GPT-3 hallucinating knowledge. One could automate this: ask the model for a reference, check whether the reference exists, and if it doesn't, tighten the prompt and demand another answer, repeating until the citation checks out. I will explore this method. Thanks again!
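A minimal sketch of that retry loop, with hypothetical stubs: `ask_model` stands in for a real LLM API call, and `reference_exists` stands in for an actual citation lookup (e.g. querying a bibliographic database). Neither is a real API; both are placeholders for whatever model and verifier you use.

```python
def ask_model(prompt, attempt):
    # Stub standing in for an LLM call; a real version would hit an API
    # and parse the answer plus its cited reference out of the response.
    fake_answers = [
        ("Answer A", "Smith et al. 2099, Journal of Imaginary Results"),
        ("Answer B", "Vaswani et al. 2017, Attention Is All You Need"),
    ]
    return fake_answers[min(attempt, len(fake_answers) - 1)]

# Stub reference index; a real verifier might search Crossref,
# Semantic Scholar, or Google Scholar for the cited title.
KNOWN_REFERENCES = {"Vaswani et al. 2017, Attention Is All You Need"}

def reference_exists(ref):
    return ref in KNOWN_REFERENCES

def answer_with_verified_reference(question, max_attempts=5):
    prompt = f"{question} Cite a real reference."
    for attempt in range(max_attempts):
        answer, ref = ask_model(prompt, attempt)
        if reference_exists(ref):
            return answer, ref
        # Reference failed verification: tighten the prompt and retry.
        prompt = (f"{question} Your previous citation '{ref}' could not "
                  f"be found. Cite a verifiable reference.")
    raise RuntimeError("no verifiable reference after retries")

answer, ref = answer_with_verified_reference("What paper introduced transformers?")
print(answer, "|", ref)
```

The loop only guards against fabricated citations, not wrong-but-real ones; a stricter version could also check that the reference's content actually supports the answer.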


