While I agree that would let the user use the system, the system should do the right thing in either situation, or at least abort and say it doesn't understand. That's the problem with LLMs so far: they can't admit when they don't understand.
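A toy sketch of the abort-rather-than-guess behavior described above (all names hypothetical, not any real system): map the request to a known intent, and refuse when there is no single confident match instead of picking an interpretation.

  KNOWN_INTENTS = {
      "restart": "Restarting the service...",
      "status": "All services are running.",
  }

  def handle(request: str) -> str:
      words = request.lower().split()
      matches = [intent for intent in KNOWN_INTENTS if intent in words]
      if len(matches) != 1:
          # Zero matches means we don't understand; more than one is
          # ambiguous. Either way, say so instead of guessing.
          return "Sorry, I don't understand that request."
      return KNOWN_INTENTS[matches[0]]

  print(handle("please restart the server"))  # Restarting the service...
  print(handle("restart or status?"))         # Sorry, I don't understand...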