Hacker News

> It tells more about what kind of content you are liking and watching than anything else.

This sounds very much like victim-blaming, I don't think it's appropriate to side with TikTok on this.




It's not victim blaming. It's widely known that the "magic" of TikTok is that it carefully tracks your interactions with content as you consume it, and seamlessly adapts your stream to what it believes you'd like to see. It's algorithmic suggestion turned up to 11, and about as close to mind reading as we've come as a species.


It seamlessly adapts your stream to what it believes will increase “engagement”. That's a different thing from “what it believes you'd like to see”.


TikTok seems to have presented me with shock/outrage content in the name of engagement less than any other social media, far less.

I almost always see content that is only fun.

That has made it much better for long-term engagement and positive for my mental health, unlike other social media's short-term outrage engagement, which is wholly negative.


That means the TikTok algorithm designers are half-way competent.


I intended this in contrast to your point above.

It could mean the algorithm designers have decided that optimising for fun is more effective at increasing engagement than optimising for shock value and conflict.

But this is a different choice from the one other social media designers have made. It seems like there is some more wholesome motivation.


I don't think the other social media designers deliberately optimised for shock value and conflict. I think that was a side-effect of a naïve optimising algorithm: short term (i.e. until people just quit the social media platform entirely), conflict and doomscrolling are two of the most engaging behaviours.
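The trade-off described above, a naïve ranker chasing short-term engagement versus one that accounts for users eventually quitting, can be sketched as a toy example. All item names, probabilities, and the `churn_risk` weighting below are invented for illustration; this is not any platform's actual system.

```python
# Toy sketch of engagement-only vs retention-aware ranking.
# Each item: (title, predicted immediate engagement, churn risk),
# where churn risk approximates how much the item drives users to
# quit the platform in the long run. All numbers are made up.
catalog = [
    ("calm craft tutorial",   0.40, 0.01),
    ("funny pet clip",        0.55, 0.02),
    ("outrage politics rant", 0.80, 0.30),
    ("shocking conflict clip",0.75, 0.25),
]

def naive_rank(items):
    # Optimise only short-term engagement; churn risk is ignored,
    # so conflict content floats to the top as a side-effect.
    return sorted(items, key=lambda it: it[1], reverse=True)

def retention_aware_rank(items, horizon=10.0):
    # Penalise items by the long-term cost of driving users away.
    # 'horizon' (hypothetical) weights future sessions vs this one.
    return sorted(items, key=lambda it: it[1] - horizon * it[2], reverse=True)

print(naive_rank(catalog)[0][0])            # -> outrage politics rant
print(retention_aware_rank(catalog)[0][0])  # -> funny pet clip
```

Under these made-up numbers, the greedy ranker surfaces outrage content while the retention-aware one prefers fun content, which is one way to read the difference the comments above are debating.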


Have you ever used TikTok?


I think there's a problem with this though: people aren't always in the mood for the same stuff. The easiest distinction here is 'inappropriate' content from the rest. Sometimes the user is interested in that content, but not every time they open the app. It might even be that most of the time they don't want to see that content.

I experience this with politics videos on YouTube. Sometimes I'm in the mood for that. One evening I'll watch a few politics videos. The next day, my recommendations are filled with politics. Over time my recommendations get back to normal. Then I'm struck with the politics bug again, but my recommendations don't give me an easy way to find more politics content. They're recommending the type of stuff I'd normally watch. The day after that I'm annoyed again about politics being all over my recommended lists.

It feels like I should have separate profiles based on the topics I'm interested in. The platforms don't seem to really support that, though. Even different users seem to influence each other's suggestions.


This may be true for people with short attention spans. For me personally, sure, I may click on a clickbait article or watch an attractive lady, but that doesn't mean I want those suggestions shown at all.

It's like keeping a lot of sugary chocolate in my apartment: the wise choice is not to buy anything containing sugar in the first place, since I can't stop myself from eating it if it's in front of me all the time.


Maybe reptilian mind reading. While we might be intuitively very attracted to sexual and gore content, that doesn't mean it's what we want.




What about when you see the content on first install?


Then it's a problem (for the sake of the argument I consider "the content" to be either illegal or borderline psycho).

And for the record, I consider a lot of TikTok pranks to be borderline psycho (e.g. gaslighting your SO for giggles about divorce, abortion, etc.).



