
LLMs are just waayyy too dangerous for something like home automation, at least until you can be much more certain of getting a guaranteed output for a given input.

A very dumb, innocuous example: you order a single pizza for the two of you, then tell the assistant “actually we’ll treat ourselves, make that two”. The assistant corrects the order to two. Then the next time you order a pizza “because I had a bad day at work”, the assistant just assumes you ‘deserve’ two, even if your verbal command is to order one.

A much scarier example: you ask the assistant to “preheat the oven when I move downstairs” a few times. Then one day you go on vacation and tell the assistant “I’m moving downstairs” to let it know it can turn everything off upstairs. You pick up your luggage in the hallway none the wiser, leave, and... yeah. Bye oven, or bye home.

Edit: enjoy your unlocked doors, burned-down homes, emptied Powerwalls, and rained-in rooms! :)




No. LLMs do not have memory like that (yet).

Your 'scary' examples are very hypothetical and would require intentional design to achieve today; they would not happen by accident.
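
To be concrete, here's a toy sketch (all names hypothetical, no real assistant framework implied) of why that memory would have to be built on purpose: a bare LLM call is stateless, so anything "remembered" across sessions is context that someone deliberately stored and replayed.

    history = []  # prior turns; persists only if the integrator stores it somewhere

    def build_prompt(user_utterance: str) -> str:
        # Without this deliberate replay step, the model only sees the new input.
        return "\n".join(history + [user_utterance])

    history.append("user: actually, make that two pizzas")
    print(build_prompt("user: order one pizza"))

Skip the append and the pizza incident from upthread simply can't happen.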


I love how burning your house down is something that deserves air quotes according to you.

All I can tell you is this: LLMs frequently misinterpret, hallucinate, and “lie”.

Good luck.


Preventing your house from burning down belongs on the output-handling side, not the instruction-processing side. If there is any output from an LLM at all that will burn your house down, you already messed up.
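
To sketch what that could look like (hypothetical names throughout, not a real home-automation API): the model proposes structured commands, and a dumb, deterministic gate decides which ones are allowed to reach hardware.

    from dataclasses import dataclass

    @dataclass
    class Command:
        device: str
        action: str

    # Devices that can cause physical harm never act on model output alone.
    DANGEROUS_DEVICES = {"oven", "stove", "space_heater", "door_lock"}

    def dispatch(cmd: Command, confirmed_by_user: bool = False) -> None:
        if cmd.device in DANGEROUS_DEVICES and not confirmed_by_user:
            raise PermissionError(f"{cmd.device!r} needs explicit confirmation")
        print(f"executing: {cmd.action} on {cmd.device}")

    dispatch(Command("light", "turn_on"))  # fine
    try:
        dispatch(Command("oven", "preheat_230C"))  # blocked
    except PermissionError as e:
        print(f"refused: {e}")

However confused the model gets, the blast radius is whatever the gate allows.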


I'd go so far as to say it should be handled at the "physics" level. Any electric apparatus in your home should be able to be left on for weeks without causing fatal consequences.


I'm not taken in by the current AI hype, but having LLMs as an interface for voice commands is genuinely revolutionary and a good fit for this problem. It's just an interface to your API that provides the functions as you see fit, and you can program it in natural language.
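
Roughly like this, as a toy illustration (names invented here; the actual tool-calling wire format varies by provider): the model only ever selects from functions you wrote, so the natural-language layer is just routing.

    import json

    def set_light(room: str, on: bool) -> str:
        # Ordinary deterministic code; the model can only choose to call it.
        return f"{room} light {'on' if on else 'off'}"

    TOOLS = {"set_light": set_light}

    # What a tool-calling model might emit for "turn on the kitchen light".
    model_output = '{"name": "set_light", "arguments": {"room": "kitchen", "on": true}}'

    call = json.loads(model_output)
    print(TOOLS[call["name"]](**call["arguments"]))  # -> kitchen light on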


Chapter 4: In Which Phileas Fogg Astounds Passepartout, His Servant

Just as the train was whirling through Sydenham, Passepartout suddenly uttered a cry of despair.

"What's the matter?" asked Mr. Fogg.

"Alas! In my hurry—I—I forgot—"

"What?"

"To turn off the gas in my room!"

"Very well, young man," returned Mr. Fogg, coolly; "it will burn—at your expense."

- Around the World in 80 Days by Jules Verne (1872), who knew that leaving the heat on while you went on vacation wouldn't burn down your house.




We might have different ovens, but I don't see why mine would burn down my house when left on during a vacation but not when baking things for several hours.

Once warm, it doesn't just get hotter and hotter; it keeps the temperature I asked for.


???


When you think about the damage that could be done with this kind of technology, it's incredible.

Imagine asking your MixAIr to sort out some fresh dough in a bowl and then leaving your house for a while. It might begin to spin uncontrollably fast and create an awful lot of hyperbole-y activity.


I suggest looking up how electric motors work lest you continue looking stupid :)


I’ll just not worry myself over seemingly insane hypotheticals, lest I continue looking stupid, thank you.


I mean, there are multiple people all over the main post pointing out how LLMs aren't reliable, but you do you.


All of those outcomes are already reachable by fat-fingering the existing UI. The oven won't burn your house down; most modern ones will turn off after some preset time, so at worst you're just going to overpay for electricity or need to replace the heating element. Unless you have a 5-ton industrial robot connected to your smart home, or an iron sitting on a pile of clothes plugged into a smart socket, you're probably safe.


As if it weren't dangerous enough by default, he specifically instructs it to act as much as possible like a homicidal AI from fiction, and then hooks it up to control his house.

I think there's definitely room for this sort of thing to go badly wrong.



