
The one I linked is from a Swedish conspiracy theory channel with 16k subscribers. It has been up for 2 months, has gotten 40k views (huge for Swedish-language content), and does not try to hide itself at all ("WakeUpGlobe SE" and uses "corona" in the title). It would be trivial for a human to find this.



You can't blanket-ban terms; otherwise you'd end up accidentally catching far more false positives than a few podcast apps. For example, "corona" is a pretty broad term -- it would be like banning Nazi content but using the word "German" from "National Socialist German Workers' Party" as your identifier, then wondering why half the German-language videos disappear.
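As a rough illustration of the false-positive problem (a hypothetical sketch, not how YouTube actually moderates; the banned terms and video titles below are invented for illustration):

```python
# Hypothetical sketch of a naive keyword-based filter and the false positives it produces.
# The banned terms and video titles are made up for illustration only.

BANNED_TERMS = {"corona", "plandemic"}

def is_flagged(title: str) -> bool:
    """Flag any title containing a banned term as a substring."""
    lowered = title.lower()
    return any(term in lowered for term in BANNED_TERMS)

videos = [
    "Corona conspiracy EXPOSED!!!",                # what the filter is meant to catch
    "Tasting Corona Extra vs. other lagers",       # beer review: false positive
    "Solar corona explained (astronomy lecture)",  # science content: false positive
    "Coronary artery disease basics",              # substring match: false positive
]

for title in videos:
    print(f"{'FLAGGED' if is_flagged(title) else 'ok':8} {title}")
```

Three of the four hits are legitimate content, which is why term matching alone doesn't work and context (often language-specific context) is needed.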

> It would be trivial for a human to find this.

Someone literate in Swedish, maybe (to gather the context of those keywords), but it isn't humans who do this.

Google are big into automation to the extent that they have machines doing their review. You might consider that wrong, but then you have to ask yourself how many humans it would take to moderate a platform as large as YouTube. I bet that whatever number you come up with wouldn't be enough, and someone else would say "I found another video that was trivial to find; Google don't hire enough platform moderators!"

Plus it's a pretty horrible job being a professional moderator and spending your whole day reviewing the dregs of society. I've read reports where people who've done it say it's had a very real negative impact on their mental health.

As I said earlier, fixing problems like this at scale is insanely hard. It's one of those things that might seem easy at a superficial level, but it's fraught with errors, and you can guarantee that whatever decision the moderator makes (be that human or algorithm), someone will be unhappy and claim it's not fair.



