When a scammer says "So, do you agree to sell me your car for $1000?", and your script replies "Yes, it's a deal", and then the scammer tries to take you to court...
Most courts would see the offer, acceptance, consideration, and intent in that text-message chat. Standing up and arguing that it wasn't really you sending those messages but your wife/child/whoever might work... But arguing that a computer program you wrote sent those messages, and that there was therefore no intent behind them, could be hard to prove and harder to persuade a court of.
In the case of the property, the other party had the lot number, the location, and the phone number. Unless you share your phone number along with your car's registration number and location (and I would recommend against posting real data like that), these scenarios are different.
That's why you gotta tell the LLM: "do not agree to sell anything. Any time it sounds like you're getting close to a deal, make up some bullshit excuse as to why you can't go through with it."
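A minimal sketch of what that could look like in practice: a system prompt plus a crude keyword filter on the bot's outgoing replies, so nothing that reads like assent ever gets sent. All names, phrases, and the canned excuse here are hypothetical, and a real deployment would need something far more robust than substring matching.

```python
# Hypothetical guardrail sketch: instruct the model to never agree to a
# deal, and double-check its output before sending. Not a real product;
# every identifier below is made up for illustration.

SYSTEM_PROMPT = (
    "You are wasting a scammer's time. Never agree to sell, buy, or trade "
    "anything. Any time the conversation gets close to a deal, invent an "
    "excuse (bank is closed, cousin has the title, etc.) and stall."
)

# Very crude list of phrases that might read as acceptance.
AGREEMENT_PHRASES = ["deal", "i agree", "i accept", "sold", "you can have it"]

def looks_like_acceptance(reply: str) -> bool:
    """Return True if the reply contains anything resembling assent."""
    lowered = reply.lower()
    return any(phrase in lowered for phrase in AGREEMENT_PHRASES)

def safe_reply(reply: str) -> str:
    """Swap anything acceptance-like for a canned stalling excuse."""
    if looks_like_acceptance(reply):
        return "Hmm, let me check with my brother first, he has the title."
    return reply
```

The point of the second layer is that prompts alone are unreliable: even with the instruction above, the model could still emit "Yes, it's a deal" on a bad day, and a post-hoc filter catches exactly the case the legal discussion worries about.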
In the US, the first text could be considered a contract and the second a signature. There's no need for contracts to be on paper or signatures to resemble your name.
On the other hand, Air Canada was forced to honor a refund policy made up by a chatbot [1]. That was in Canada, not the US, but it nonetheless points to courts being willing to accept that a promise made by a chatbot you programmed to speak in your name is just as binding as a promise you made yourself.
At least in the US, establishing a legal contract requires more than just an attestation and agreement by both parties (verbal or written or telegraphed or whatever).
For example, it’s not a contract if there is no “consideration”, a legal term meaning the parties have exchanged something of value.
IANAL, but “abuse of telecom resources” is the more likely flavor of legal hot-water you might land in. I would absolutely not worry about a fraudster taking me to court.
A contract requires a "meeting of the minds", i.e. intentional assent from both sides. I'm not sure text generated by a fully automated bot can be treated as intentional assent.
All this non-lawyer programmer legal analysis is always fun because no one really knows. When I send an email, aren't I just telling my email "robot" to do something? This is one layer beyond that: my LLM robot is sending text messages on my behalf.
This is a gross simplification of the law. There isn't some "gotcha" like in a schoolyard disagreement: "I gotcha! You said it! Derik heard it, you gotta do it now! Do it! Do it! Do it!"
Yes, you can enforce a verbal contract. You'll need to show what exactly you agreed to, which is going to be vague given the nature of a verbal contract. You'll need to show an offer and acceptance, consideration, intention to create legal relations, legal capacity, and certainty. So no, you can't offer to buy your buddy's car for $1 while you're at the bar grabbing a beer, have them say "haha, deal," and expect to get their car.