
We are, but the blame has to be shared, too. In many cases the algorithm is being reinforced by your actions and everyone else's. It may not even be specifically trying to prioritize polarizing content: that may just be the content you engage with most, with the algorithm blindly following your whims and preferences.

I've used YouTube for many hours per day for several years. I've almost never seen a single thing appear on my home page that was polarizing or even clickbaity. The very rare times I do (always after I watch something that's kind of adjacent), I just click "Not interested" and I never see it or anything like it again. It's done a pretty good job of predicting what I would and wouldn't be interested in.

Same with Twitter. I just unfollow anyone I find tweeting polarizing or charged things. My Twitter feed looks pretty close to the HN front page.

Many people crave these things, whether they want to or even realize it. I think it's going to be this way for a very long time.




Sorry but this is a very naive view of human psychology. The Social Dilemma on Netflix does a good job of explaining why asking people to just exert more willpower is not the solution.


Maybe it is. I just am repulsed by stuff like that and try to get it out of my sight whenever possible. I figure most of HN is the same way. It pretty much never appears on any of my feeds, since I don't hesitate to click the "Not interested" button on the rare occasions it appears.

I'll watch that film, though.


The new AI is different from the algorithms of old. The old algorithms were one-size-fits-all, and everyone got exactly the same schlock. The new AI knows you individually and shows you, as an individual, the kinds of things you are most likely to engage with. It doesn't matter at all if the things you as an individual are most likely to engage with have less inflammatory headlines: the system is still doing the same thing to you as it's doing to everyone else.


Right. But that was exactly what I was saying in my initial post (above my previous reply).


That is, until you accidentally click on a single outrage-clickbait video; then YouTube will immediately suggest at least 20 other videos of the same kind, and you have to click “not interested” on videos for several weeks until your feed is sane again.

And IME, it doesn’t even have to be videos watched by most people: a subset of “power users” who binge a certain kind of video are over-represented in the recommendation engine.


>That is, until you accidentally click on a single outrage-clickbait video; then YouTube will immediately suggest at least 20 other videos of the same kind, and you have to click “not interested” on videos for several weeks until your feed is sane again.

For me it takes a single round of a few "Not interested" clicks. Definitely not weeks, or even days.


No, it’s being driven by the lowest common denominator for engagement, which is distinct from personal preferences.

For a self-contained example, spam in large Facebook groups always rises to the top, because many people comment asking for it to be deleted, causing the algorithm to show it to more people, some of whom comment, until a moderator finally kills the post.

These kinds of side effects do not happen in a voting based system or a straight chronological feed.
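The feedback loop described above can be sketched with a toy model (this is a hypothetical illustration, not Facebook's actual ranking code): if posts are sorted by raw comment count, complaint comments on a spam post count as engagement and push it up, while a chronological sort ignores comment volume entirely.

```python
# Toy comparison of an engagement-ranked feed vs. a chronological one.
# All post data below is made up for illustration.

def rank_by_engagement(posts):
    # Engagement ranking: most-commented posts first, regardless of
    # whether the comments are praise or "please delete this spam".
    return sorted(posts, key=lambda p: p["comments"], reverse=True)

def rank_chronological(posts):
    # Chronological ranking: newest first; comment volume is ignored,
    # so complaint comments cannot amplify a post's reach.
    return sorted(posts, key=lambda p: p["posted_at"], reverse=True)

posts = [
    {"title": "Weekly meetup notes", "posted_at": 3, "comments": 2},
    {"title": "SPAM: crypto giveaway", "posted_at": 1, "comments": 40},  # complaints count as engagement
    {"title": "Question about dues", "posted_at": 2, "comments": 5},
]

print(rank_by_engagement(posts)[0]["title"])   # the spam post floats to the top
print(rank_chronological(posts)[0]["title"])   # the newest post leads instead
```

Under this model, every extra "delete this" comment only strengthens the spam post's position in the engagement-ranked feed, which is exactly the side effect a chronological or vote-based feed avoids.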


For Facebook groups, yes, this is a big problem. That's one reason why I don't use Facebook. For one's personal feed, I don't think the same issue necessarily applies.


> just might be the content you engage with the most, and the algorithm blindly follows your whims and preference

No, it's the content that other people engage with. Disregarding the whole "engagement is a good metric of how much you want to see something" bullshit, if you served me food based on what other people like to eat, it would be a weird mix of gluten free vegan stuff, and also coke, pizza and doritos.

I don't want any of that shit. I'm not "many people". I don't need to be fed the same irrelevant garbage as them. But the only way to achieve that is to unfollow everyone and not get many useful updates. Which is what I'm doing, but it's just barely useful, and the popular crap still seeps in at every opportunity.


That just hasn't been my experience with Twitter or YouTube. (I don't use Facebook or other things.) I agree that would be very annoying if it happened to me, but for some reason it just doesn't seem to. I probably watch so many videos that it's just picked up my preferences well.

I do want to see what other videos are watched by large numbers of people who've watched things I've already watched. That's how I discover new and interesting things. I've found tons of great content that way, and pretty much no clickbaity or LCD / popular crap seeps in (maybe once every few months, but the "Not interested" immediately takes care of it). I don't know how common my experience is.


Nobody likes to eat gluten-free stuff; unfortunately, some people like me have to eat it.


Joe Edelman wrote a nice article about algorithms that are driven by metrics:

"This talk is about metrics and measurement: about how metrics affect the structure of organizations and societies, how they change the economy, how we’re doing them wrong, and how we could do them right."

https://medium.com/what-to-build/is-anything-worth-maximizin...



