
Lie? I don't take kindly to such words, because you're ascribing malicious intent where there is none. Please check your tone... HN comments are about assuming the best in everyone.

This only applies to photos uploaded to iCloud Photos. Every piece of Apple's documentation says exactly that, including the technical details: https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...

The hash matching occurs on device, but only for images being uploaded to iCloud Photos:

> Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the database of known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines whether there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result. It also encrypts the image’s NeuralHash and a visual derivative. This voucher is uploaded to iCloud Photos along with the image.

Read that PDF. You'll see that everything in it is designed around iCloud Photos only.
