Hacker News

The app literally makes an algorithmic feed of content that you personally "like". I haven't seen anything like this at all. Mostly dancing and singing people, and comedic sketches.


No, it starts out by showing you popular, random, or lightly profiled content ("you appear to be a male in your 40s, so let's start here"), and then simply shows more of whichever content types increased your screen time.
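A minimal sketch of the cold-start step described above. This is not any real platform's logic; the seed lists, demographic buckets, and function names are all assumptions made up for illustration:

```python
# Illustrative cold-start sketch: with no watch history, fall back
# from a coarse demographic guess to globally popular content.
# All names and lists here are hypothetical.

POPULAR = ["dance_challenge", "comedy_sketch", "pet_video"]

# Hypothetical seed lists keyed by a demographic guess.
DEMOGRAPHIC_SEEDS = {
    ("male", "40s"): ["woodworking", "classic_rock", "grilling"],
}

def cold_start_feed(profile_guess):
    """Seed the feed from a demographic guess, else from popularity."""
    return DEMOGRAPHIC_SEEDS.get(profile_guess, POPULAR)

print(cold_start_feed(("male", "40s")))  # demographic seed list
print(cold_start_feed(None))             # no guess -> popular content
```

From this starting point, everything the commenter describes next is just feedback: whatever you linger on replaces the seed list.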

This has nothing to do with your preferences/what you like. It doesn't care if you feel disgust, as long as the app stays open.

Suggestion/auto feeds are designed to abuse human psychology to maximize time wasted, not interest.


If you like a video, watch it all the way through, or comment on it, then it tries to show you more things like that. If you skip past content, or mark it as not interested, then it tries to show you fewer things like that.
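The signal-weighting the comment describes can be sketched roughly like this. The signal names and weights are assumptions for the example, not any platform's actual values:

```python
# Illustrative sketch: fold implicit engagement events into
# per-topic scores. Weights are invented for the example.

from collections import defaultdict

SIGNAL_WEIGHTS = {
    "like": 3.0,            # explicit positive signal
    "watched_to_end": 2.0,  # implicit positive signal
    "comment": 2.5,
    "skip": -1.0,           # implicit negative signal
    "not_interested": -4.0, # explicit negative signal
}

def update_scores(scores, events):
    """Accumulate (topic, signal) events into per-topic scores."""
    for topic, signal in events:
        scores[topic] += SIGNAL_WEIGHTS.get(signal, 0.0)
    return scores

scores = defaultdict(float)
events = [
    ("dancing", "watched_to_end"),
    ("dancing", "like"),
    ("politics", "skip"),
    ("politics", "not_interested"),
]
update_scores(scores, events)

# Rank topics by accumulated score: more dancing, less politics.
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked)  # -> ['dancing', 'politics']
```

Note that watching to the end counts as positive here even if the viewer watched in disgust, which is exactly the tension the rest of the thread argues about.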

So if you are disgusted by something: Stop Watching It!

Yes this does optimise for time spent watching, but interest definitely plays into it.


People don't read the news because they enjoy hearing about crime and corruption. They don't read about the Peter Madsen murder case because they enjoy the thought of chopped-up people.

Blaming users for engaging by subconscious impulse with a machine designed with every psychological trick in the book to force them to do just that, to optimize screen time at any cost, is just wrong. Assigning interest, joy, or even conscious intent to that engagement is very, very wrong.

The only thing you can do is remove the app, but it won't take long before you run into the next abusive machine. And quite frankly, as with any other intentionally addictive interaction, it is not easy to realize the problem and remove it.


Which is a bit like telling a heroin addict to just Stop Doing Drugs!

At our core, we're still stupid monkeys. We cannot help ourselves on an individual level. The platforms must take this responsibility, there is no other way.


One thing that's clear is that these algorithms are here to stay. What's not clear yet is how we should cope with them. Culture emerges to help people deal with problems - so what cultural factors will emerge to protect us?

I'm starting to think the answer is a sort of "feed hygiene" that we will all have to learn to keep up with. I successfully made my Facebook feed a very calm place by unfollowing toxic friends, hiding posts that made me angry, and joining positive and productive groups.

Maybe we'll be expected to teach our kids how to do this. Maybe it will become part of mainstream internet culture that we know how to curate our own feeds through our activity.


In other words, it shows you more of what you like to watch, not more of what you'd like to like to watch.


No, it shows what you actually watch. If you are paralyzed, it tries to paralyze you more.


The algorithm does not know whether you're watching in horror or enjoying the video. It can only tell that you stopped scrolling for a few extra seconds.
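The point can be made concrete with a tiny sketch: the only observable here is dwell time, so any cutoff (the threshold below is an assumption for illustration) treats a horrified pause and a delighted pause identically.

```python
# Illustrative sketch: dwell time is the only input; emotion is
# invisible. The threshold value is an assumption for the example.

DWELL_THRESHOLD_S = 3.0  # assumed cutoff for "engaged"

def implied_engagement(dwell_seconds: float) -> bool:
    """The system sees seconds of not-scrolling, not emotion."""
    return dwell_seconds >= DWELL_THRESHOLD_S

print(implied_engagement(7.5))  # long pause -> True, reason unknown
print(implied_engagement(0.8))  # quick scroll past -> False
```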


It is nonsensical to impute anything about a person's character from what shows up in their feed, which is the output of black-box algorithms. This feels like the beginning of a witch hunt.


That sounds like YouTube.



