
Wow. This could be messed up for attorneys, DCS, social workers, etc. The announcement alludes mostly to child pornography, but I hope it doesn't extend to evidence of physical abuse.

Those photos are usually taken on phones by spouses, doctors, schools, etc., and passed along to the people above as evidence for a DNN or similar case.

Glad my kids have aged out of baby bath photos.

And those poor people who are going to have to serve as the auditing safeguard: I hope they take care of their mental health.




In every thread about this, someone makes the same false assumption: no, Apple is not scanning for naked children or children in pain. It generates a hash that is compared against hashes of NCMEC-verified CSAM pictures (and while some HN commenters claim the DB contains non-CSAM material, that has not been verified nor ever been reported by a news publication), and it does indeed only scan photos destined for iCloud Photos (which I theorize is the only part keeping this system legal[0]).

0: https://news.ycombinator.com/item?id=28112982
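
As a rough illustration of what "comparing hashes against a database" means here (this is not Apple's actual pipeline, which uses a perceptual NeuralHash, private set intersection, and a match threshold; the helper names and file format below are made up, and SHA-256 stands in for the perceptual hash), a minimal sketch in Python:

    # Minimal sketch of hash-list matching, under the assumptions above.
    # An exact-bytes SHA-256 would miss resized/re-encoded copies, which is
    # why the real system uses a perceptual hash instead.
    import hashlib
    from pathlib import Path

    def file_hash(path: Path) -> str:
        """Stand-in for a perceptual hash: SHA-256 of the file's exact bytes."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def load_known_hashes(db_path: Path) -> set[str]:
        """Hypothetical blocklist loader: one hex digest per line."""
        return set(db_path.read_text().split())

    def flag_queued_photos(queued: list[Path], known: set[str]) -> list[Path]:
        """Return only upload-queued photos whose hash appears in the known set."""
        return [p for p in queued if file_hash(p) in known]

The point of this design is that a match only says "this file's hash is on the list of known images"; it says nothing about what an arbitrary new photo depicts.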


>that has not been verified nor ever reported on by a news publication)

How could it be? The list is literally the property of the secret police; you can't know what's on it. No one can audit it except the police themselves.


Auditing it would mean looking at CSAM, which is illegal. If it were widespread, I would expect at least one of the manual reviewers who are able to legally view the CSAM to contact major publications (under the promise of staying anonymous) and blow the whistle on this issue.

To be clear, I'd expect whistleblowing if these manual reviewers were tasked with "accepting" flagged submissions that aren't actually CSAM.


You're essentially saying the FBI will audit itself, find it did something wrong, and then fix it.





