
No. LLMs do not have memory like that (yet).

Your 'scary' examples are very hypothetical and would require intentional design to achieve today; they would not happen by accident.

I love how burning your house down is something that deserves air quotes according to you.

All I can tell you is this: LLMs frequently misinterpret, hallucinate, and “lie”.

Good luck.


Preventing your house from burning down belongs on the output-handling side, not the instruction-processing side. If there is any LLM output at all that can burn your house down, you have already messed up.
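
A minimal sketch of what that output-handling layer could look like, assuming the LLM emits a JSON command such as {"device": "oven", "action": "set_temp", "value": 500}. The device names, actions, and temperature cap are all hypothetical; the point is that hard limits live outside the model:

    import json

    # Hard limits enforced no matter what the model says.
    SAFE_ACTIONS = {
        "oven": {"set_temp": range(0, 232)},  # cap at ~232°C / 450°F
        "lights": {"on": None, "off": None},
    }

    def execute(raw_llm_output: str) -> str:
        try:
            cmd = json.loads(raw_llm_output)
            device, action, value = cmd["device"], cmd.get("action"), cmd.get("value")
        except (json.JSONDecodeError, KeyError):
            return "rejected: unparseable command"
        allowed = SAFE_ACTIONS.get(device, {})
        if action not in allowed:
            return f"rejected: {device!r} does not allow {action!r}"
        limit = allowed[action]
        if limit is not None and value not in limit:
            return f"rejected: {value} outside safe range for {device}.{action}"
        # Only now is the command forwarded to the real device API.
        return f"ok: {device}.{action}({value})"

    print(execute('{"device": "oven", "action": "set_temp", "value": 500}'))
    # -> rejected: 500 outside safe range for oven.set_temp

A hallucinated temperature then fails closed instead of reaching the hardware.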


I'd go as far as saying it should be handled at the "physics" level: any electrical appliance in your home should be able to be left on for weeks without fatal consequences.


I'm not taken in by the current AI hype, but having LLMs as an interface for voice commands is genuinely revolutionary and a good fit for this problem. The LLM is just an interface to your API, which provides the functionality as you see fit. And you can program it in natural language.
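
A rough sketch of that pattern, assuming the model returns structured output naming one of your own API functions. llm_translate is a hypothetical stand-in for a real model call (e.g. a tool-calling or JSON-mode API), and the function names are made up:

    from typing import Callable

    def set_thermostat(temperature: int) -> str:
        return f"thermostat set to {temperature}"

    def toggle_lights(room: str, on: bool) -> str:
        return f"lights in {room} turned {'on' if on else 'off'}"

    # The model may only select from this registry; it cannot invent actions.
    REGISTRY: dict[str, Callable] = {
        "set_thermostat": set_thermostat,
        "toggle_lights": toggle_lights,
    }

    def llm_translate(utterance: str) -> dict:
        # Stand-in for the actual model call returning structured output.
        return {"name": "toggle_lights", "args": {"room": "kitchen", "on": True}}

    def handle_voice_command(utterance: str) -> str:
        call = llm_translate(utterance)
        fn = REGISTRY.get(call["name"])
        if fn is None:
            return "rejected: unknown function"
        return fn(**call["args"])

    print(handle_voice_command("turn on the kitchen lights"))
    # -> lights in kitchen turned on

The "programming in natural language" part is the prompt that describes the registry to the model; the host code never executes anything outside it.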
