
> (It is also interesting to me that in ~2010, such discussions used to use terrorism, and now it’s CSAM/pedophilia.)

Nah, the CSAM/pedophilia argument goes way back. It's just that the terrorism argument has largely disappeared because the emotional impact of 9/11 in particular has faded (terrorism was much more rampant in parts of the previous century; I can't recall the exact decade, but if I had to guess I'd say the 70s or 80s).

The thing is, tools can be used for Good and Bad. A knife, a car, the Internet, social media.

The problem is that if we live in a society where your online identity is pseudonymous at best (remember the recent Rotterdam hospital shooting; the shooter was found on 4chan), and there is post-moderation instead of pre-moderation, you are going to have garbage content.

A lawsuit like this is used to prove negligence in proportion to the number of users versus the amount of content; i.e. it will result in more active moderation by Meta. A comparable case: hosting providers in the Netherlands used to be terrible at removing CSAM, and almost all of them have since improved.

Does that mean the content is removed from the Internet? No, that's impossible. But it sure as hell became more difficult for the perpetrators to distribute, which is a decent compromise.
