
> I'm not used to programming where one of the possible error states is that the computer just straight up decides it doesn't want to do the thing I asked it to do!

Without the anthropomorphism, an unexpected error condition from an external system is not that unusual. That LLMs have behavior that is both loosely specified and (barring things like the ability to set the "temperature" to 0) nondeterministic makes such errors more common than with most systems you'll interact with, sure.
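
To make that concrete, here is a minimal Python sketch of treating a refusal as just another error state to catch and retry. Everything here is illustrative: fake_llm is a stand-in for a real client (not any vendor's API), and the string-prefix refusal check is a naive placeholder.

    import random

    class LLMRefusal(Exception):
        """The model declined the request: one more error state to handle."""

    def fake_llm(prompt: str, temperature: float = 0.7) -> str:
        # Stand-in for a real client call; refuses nondeterministically,
        # mimicking the "decides it doesn't want to" failure mode.
        if temperature > 0 and random.random() < 0.1:
            return "I can't help with that."
        return f"result for: {prompt}"

    def call_llm(prompt: str, retries: int = 3) -> str:
        # Sampling at temperature > 0 is nondeterministic, so a simple
        # retry often clears a refusal; at temperature 0 the same request
        # would (in principle) keep producing the same answer, refusal
        # included.
        for _ in range(retries):
            text = fake_llm(prompt)
            if text.lower().startswith(("i can't", "i cannot", "i won't")):
                continue  # refused this time; try again
            return text
        raise LLMRefusal(f"model refused {retries} times: {prompt!r}")

    print(call_llm("summarize this log file"))

The design point is the same as for any flaky external dependency: surface the condition as a typed error at the boundary so callers handle it deliberately instead of passing refusal text downstream as if it were a result.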
