
In this age of deep fakes, having someone on camera committing a crime is no guarantee that they are guilty... But it definitely provides police with a suspect to investigate further.

The most powerful tool for fighting crime is not increasing penalties or making jail more miserable. No, the most effective way to stop crime is to drastically increase the chances of getting caught and correctly sentenced.




I’m wholeheartedly against facial recognition in legal systems. The fact is that it doesn’t work well, and it has a particularly bad tendency to identify the wrong black person much more often than it misidentifies white people. From [3] below:

> Asian and African American people were up to 100 times more likely to be misidentified than white men, depending on the particular algorithm and type of search. Native Americans had the highest false-positive rate of all ethnicities, according to the study, which found that systems varied widely in their accuracy.

One. Hundred. Times. The camera may say it recorded John Smith breaking into a house, but it may well have actually been Ron Jones. Who’s the jury going to believe, though: John Smith, who was at home with his girlfriend at the time, or the hugely expensive video system the city just bought?
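To make the scale of that concrete, here is a back-of-the-envelope sketch. Every rate and gallery size below is hypothetical, not taken from the study cited above; the point is only how a 100x higher per-comparison false-positive rate plays out when one face is searched against a large gallery:

    # Back-of-the-envelope sketch; all numbers here are hypothetical.
    # In a one-to-many search, the expected number of innocent "matches"
    # is roughly (per-comparison false-positive rate) * (gallery size).

    def expected_false_hits(false_positive_rate, gallery_size):
        return false_positive_rate * gallery_size

    baseline_rate = 1e-6      # assumed 1-in-a-million per comparison
    gallery = 10_000_000      # assumed 10-million-person gallery

    for label, rate in [("baseline", baseline_rate),
                        ("100x higher", 100 * baseline_rate)]:
        hits = expected_false_hits(rate, gallery)
        print(f"{label}: ~{hits:.0f} innocent matches per search")

    # baseline: ~10 innocent matches per search
    # 100x higher: ~1000 innocent matches per search

Under those assumed numbers, the group with the higher error rate supplies a hundred times as many wrongly flagged suspects from the exact same search.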

Facial recognition doesn’t work. It’s bullshit tech, and we should stop using it until we make it deliver on its promises AND decide how to deal with its ramifications.

[1] https://sitn.hms.harvard.edu/flash/2020/racial-discriminatio...

[2] https://www.wired.com/story/best-algorithms-struggle-recogni...

[3] https://www.washingtonpost.com/technology/2019/12/19/federal...


I'm afraid you have it exactly backwards. The problem isn't that it doesn't work -- the problem is that it does work. And to the extent that it isn't perfect, well, it's still improving all the time. You cited a 3-year-old article reporting on data from up to eleven(!) years ago -- an eternity in this field. Not even worth reading at this point.

The racial bias issue is still important for now, but it's fast becoming irrelevant. We should be asking ourselves where our priorities would lie even if bias weren't a concern.


I think multispectral imaging may solve the difficulties these systems have with dark skin. However, even if this were resolved, I would still oppose these systems because I don't want to live in a panopticon prison society, even if all the technology works correctly and never makes a false accusation. I think such a society is aesthetically disgusting.

Furthermore, an accurate system is not the same as a morally just system. A ubiquitous system that recognizes people with 100% accuracy could be used for evil if it ever fell into the wrong hands. Such powerful systems should never be created in the first place; the juice isn't worth the squeeze.


I agree that it's good when used to fight property crimes and bad when used to stifle political dissent. But does anyone really believe we can avoid it at this point?


The Butlerian Jihad is inevitable.


I agree wholeheartedly. I’m waiting for the day a legal defense team presents video counter-evidence of the judge, the jury members, and Elvis all committing the crime the defendant is accused of.

A video completely indistinguishable from the video presented by the prosecution. Not sure where we go from there.


In that case, we look at the bailiff, point at the judge, and say "Take 'em away, boys."

No, we look at the primary suspect's digital footprint, analyze the video's metadata, look for other nearby video feeds, etc. We try to see whether additional evidence corroborates it.
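As an illustration of the metadata step, here is a minimal sketch (assuming ffprobe from FFmpeg is installed; the file name is hypothetical) that dumps a clip's container and stream metadata so timestamps, encoder tags, and codecs can be cross-checked against the rest of the evidence:

    import json
    import subprocess

    # Dump container/stream metadata as JSON using ffprobe (part of FFmpeg).
    # "suspect_clip.mp4" is a hypothetical file name used for illustration.
    result = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", "-show_streams", "suspect_clip.mp4"],
        capture_output=True, text=True, check=True,
    )
    info = json.loads(result.stdout)

    fmt = info.get("format", {})
    print("container:", fmt.get("format_name"))
    print("duration (s):", fmt.get("duration"))
    print("tags:", fmt.get("tags", {}))   # often includes creation_time and encoder

    for stream in info.get("streams", []):
        print(stream.get("codec_type"), stream.get("codec_name"),
              stream.get("width"), stream.get("height"))

Of course, metadata can be stripped or forged too, which is why it's only one corroboration signal among several (nearby feeds, location records, and so on).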



