
> To me TikTok was the first proof that recommendation algorithms can work.

This sounds great at first. Now imagine you are not just into woodworking, indie bands, or travel logs, but instead slightly interested in right-wing or Islamist ideology. Within a short time you are flooded with political or religious propaganda. In Europe that has been a real problem. See for example https://www.derstandard.at/story/3000000231964/auf-tiktok-re... or https://www.tagesschau.de/inland/innenpolitik/tiktok-afd-100... (sorry, German only). The right-wing AfD politician Maximilian Krah became so "popular" on TikTok that the platform had to artificially limit his reach! (Kudos to them, but it shows the extent of the problem.)

To be clear, FB and YT have the same problem of creating filter bubbles, but their algorithms are less effective and therefore less dangerous (though still dangerous enough!).




I actually find the opposite is true. YouTube constantly recommends more of the same of anything I watch, so watching one extremist or even extremist-adjacent video means I will get flooded.

What's so good about TikTok is that it keeps my interests thoroughly mixed. I'm bilingual, and I see content from multiple countries about different interests; it keeps me in touch with all of them while also presenting topical and trending content. It also seamlessly measures my interest, so if I naturally skip a couple of videos about a topic I'll see less and less of it until I see none.
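
For what it's worth, the skip-driven decay described above can be modelled with something as simple as a per-topic score that shrinks on skips and grows on full watches. The sketch below is only an illustration of that idea, not anything TikTok has published: the topic names, decay/boost factors, and drop threshold are all made-up assumptions.

    # Minimal sketch of skip-based interest decay (illustrative only).
    from collections import defaultdict

    DECAY = 0.5           # assumed multiplier when a video is skipped
    BOOST = 1.3           # assumed multiplier when a video is watched to the end
    DROP_THRESHOLD = 0.1  # assumed score below which a topic stops being shown

    interest = defaultdict(lambda: 1.0)  # every topic starts at neutral interest

    def record_event(topic: str, watched: bool) -> None:
        """Update the per-topic interest score from one implicit signal."""
        interest[topic] *= BOOST if watched else DECAY

    def recommended_topics() -> list[str]:
        """Topics still above the threshold, strongest interest first."""
        return sorted(
            (t for t, score in interest.items() if score >= DROP_THRESHOLD),
            key=lambda t: interest[t],
            reverse=True,
        )

    # A few skips in a row push a topic below the threshold and it disappears.
    for _ in range(4):
        record_event("politics", watched=False)
    record_event("woodworking", watched=True)
    print(recommended_topics())  # ['woodworking']
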


> YouTube constantly recommends more of the same of anything I watch, so watching one extremist or even extremist-adjacent video means I will get flooded.

That's true. I got pretty frustrated with YT's recommendation algorithm; the front page became bad and repetitive. However, there's always some good stuff in the right column when you select "similar".

But you know what you can also do? Actively search for stuff! I wouldn't feel comfortable putting my media consumption behavior into the hands of some addictive algorithm. (HN is already bad enough :)

> It also seamlessly measures my interest so if I naturally skip a couple of videos about a topic I'll see less and less until I see none.

Sure, but while you are interested it keeps feeding you the same stuff, like YT on steroids. This is all fine when it comes to hobbies, music, travel logs, etc., but it gets dangerous with other content. People don't really think, "I'm not that interested in these right-wing or IS propaganda videos anymore, I'll give it a break."



