
Considering the public's persistent faith in lie detectors for humans, I don't hold a lot of hope we will be wiser about lie detectors for computers.



It's going to get worse before it gets better (if it ever does). I predict that within 10 years, we'll have at least one jurisdiction experimenting with AI-generated evidence to help prosecutors get convictions. The prosecutor will push a button, the computer will spit out a sworn statement that the defendant is guilty, that statement will be admissible, and juries will convict on that evidence alone.

Drug dogs that "hit" on command are already used as probable cause generators. Breathalyzers are basically magic boxes that produce "evidence" of a crime. It's inevitable that we'll keep using AI and computers to automate convictions.


This seemed implausible to me at first, then I remembered we've already had people submitting ChatGPT-hallucinated cases in court, so maybe not :(


And those are just the ones we caught. 99% of the time, nobody is checking the citations in motions to the court. Nobody has that much time; things are rushed. I'm sure it's happening a lot more than we think.

I use GPT to help write wiki entries for a site I'm working on, and I have to be really on my game because it will hallucinate facts. I know some must have slipped through.



