
I suspect that a large portion of CSAM is nude selfies that might be susceptible to "nudges" in the behavioral economics sense.



This may well be true according to the legal definition, though a system that matches against a fixed list of hashes, perceptual or otherwise, rather than working on magical AI, would not catch those. However, if that is all the measure would be effective against, then it is rather at odds with the proponents' narrative, which makes this out to be about pictures produced and shared by adults with criminal intent.
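To make the fixed-list point concrete, here is a minimal sketch (using an exact cryptographic hash rather than a production perceptual hash; KNOWN_HASHES and the function names are made up for illustration): an image is flagged only if its hash is already in the list, so a freshly taken selfie can never match.

  # Minimal sketch, not any real scanner's implementation.
  # Only images whose hash already appears in a fixed list get flagged;
  # a novel image produces a hash that is not in the list.
  import hashlib

  # Hypothetical fixed database of known-image hashes (hex digests).
  KNOWN_HASHES = {
      "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
  }

  def exact_hash(image_bytes: bytes) -> str:
      """Cryptographic hash: changes completely if even one byte changes."""
      return hashlib.sha256(image_bytes).hexdigest()

  def is_flagged(image_bytes: bytes) -> bool:
      """Flag only images whose hash is already in the fixed list."""
      return exact_hash(image_bytes) in KNOWN_HASHES

  # A brand-new photo cannot already be in the list, so it is never flagged:
  print(is_flagged(b"bytes of a newly taken photo"))  # False

Real deployments reportedly use perceptual hashes compared by distance rather than exact digests, so near-duplicates of known images still match, but the same limitation applies: an image that was never added to the list is invisible to the system.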


To hell with your "nudges". It's a cutesy way of saying "flex of State power", and everyone effing knows it.

No. It's not appropriate.



