That may be true in principle, but it's irrelevant to Apple's CSAM process. Unless the exact material has been explicitly catalogued by NCMEC or another child-safety organisation, there won't be a hash match.

This isn't a porn detector strapped to a child detector.
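To make the distinction concrete, here's a minimal sketch of catalogue matching. The hash function and database below are placeholders, not Apple's actual NeuralHash, its match threshold, or its private-set-intersection machinery; the point is only that an image flags when its hash equals an entry already in the catalogue, and novel material produces no match at all.

```python
import hashlib

def perceptual_hash(image_bytes: bytes) -> str:
    # Placeholder: a real system uses a perceptual hash that is robust to
    # resizing and re-encoding, not a cryptographic digest of the raw bytes.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical catalogue of hashes supplied by child-safety organisations
# (entries here are made up for illustration).
known_catalogue_hashes = {
    "0" * 64,
}

def matches_catalogue(image_bytes: bytes) -> bool:
    # A match only occurs if this exact, already-catalogued material is
    # present; new images, however explicit, are never classified as such.
    return perceptual_hash(image_bytes) in known_catalogue_hashes
```

In other words, it's a lookup against known items, not a classifier making a judgement about image content.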
