Hacker News
Meta sued by states over harmful youth marketing (bloomberg.com)
326 points by coloneltcb 11 months ago | 443 comments





Growing up, I knew maybe two or three kids that had emotional problems. Now (as an adult with grown kids) I hardly know any families that don’t have such a person. I’m sure part of it is we’re more open, which is great, but I can’t attribute all of it to that. The rising generation is really struggling. My guess is social media, but maybe it’s environmental pollution or something similar. I live in the US for what it’s worth.


A good book on this is The Coddling of the American Mind. It argues that several factors (including social media, safetyism, and the decline in unstructured play time) are responsible for the significant rise in young-adult mental health issues that started around 2012. Good read.


"Hand the kid an iPad and ignore them for 4 hours" also started around 2012.


A generation before it was just sitting kids in front of the tv…


Yeah, but there are obvious differences. A lot of people watched TV with their families or friends, although this is likely less frequent now with streaming. The amount of interactivity, portability, and solitude that the iPad affords its users is undoubtedly bad for developing minds, at least in large doses.


TV wasn't personalized and algorithmic and brutally optimizing.


In my opinion, it's largely the result of poor economic conditions and rising inequality endemic to a broken system. Young people today are keenly aware that they will have a significantly lower quality of life than the generation that preceded them, while simultaneously having to work much harder to gain a fraction of it. Communism may have fallen in the Soviet Union, but it looks as if capitalism is destined to fall much the same way. It is just taking a little longer, and so the fall will be a little harder too.


I don't know which 'young people' generation you are talking about, but I don't think most of them are actively comparing themselves to previous generations. It is very difficult to accurately compare my quality of life vs., say, my mother's. Most people are just trying to live life and get by as best as they can. If anything, older generations belittle the younger ones with the 'you've got it easy' comments.

Communism never really took off, so it didn't have far to fall. But nothing lasts forever, so surely capitalism and democracy will give way eventually (in some ways they already have).


Communism was, and still is (depending on how you feel about China), a system that billions of people lived through. I think there’s been a great amnesia about this in younger people who’ve never met someone who interacted with Really Existing Socialism.


It's no accident that American colleges don't have courses dedicated to teaching students what life was like in the Soviet Union. They had money, and they had people living in nice places with cars and people who lived in ghettos and were a decade+ away from automobile ownership if ever. A reality check isn't what you need if you're raising a generation of utopian idealists.


I was born in the USSR and grew up through the restructuring following its fall. We weren't well off, and my whole family (including extended) basically lived in a 2br apartment. I don't remember feeling poor or anything; that was just the life we knew. I don't really remember seeing people who were 'better off', but I wasn't looking for it at that age. I have recently started asking my mom about what she experienced during her working years, but that process is just starting for me.

Anywho, my reference to communism not taking off is mostly to the utopian ideology. I understand many people live/d under communism, but it was really just socialism with some caveats.


The decade after 2012 was an enormously wealthy bull market.


Only for the people who could afford to gamble on the market. That accelerated the growth of inequality, which can be seen today.


>My guess is social media, but maybe it’s environmental pollution or something similar

There are plenty of other countries that have just as much social media usage as the US but don't have the same widespread emotional dysfunction, so wouldn't that suggest that social media isn't to blame?


Cultural stigma against mental health issues is nearly ubiquitous outside the US.

This causes mental health issues to go unaddressed, but, ironically, it also helps in a way, because peer pressure certainly helps reinforce individual discipline to develop techniques to keep it together. Peer pressure is a very powerful motivator.

For example fat shaming is certainly bullying but it's also certainly effective at discouraging overeating


> fat shaming is certainly bullying but it's also certainly effective at discouraging overeating

As far as I know, it just traumatizes people and discourages them from eating in front of the people shaming them. Then, as we all do, they relieve the trauma by using their usual coping mechanism - in their case, eating.

It's just like hitting kids. It teaches them to avoid getting hit; they learn nothing about their behavior.


> As far as I know, it just traumatizes people and discourages them from eating in front of the people shaming them. Then, as we all do, they relieve the trauma by using their usual coping mechanism - in their case, eating.

But if you materially get fat you will get fat shamed regardless. So while it doesn't make you stop eating garbage, you will somehow, someway figure out how to keep a decent figure. This can either be exercise or more awful methods like surgery.


> So while it doesn't make you stop eating garbage, you will somehow, someway figure out how to keep a decent figure.

What makes you say that? Lots of people remain overweight, lots of research shows that shaming doesn't have the impact you say.

Also, lots of people can't physically do anything about it.


> but it's also certainly effective at discouraging overeating

I think you should back that statement up with a citation.


Anecdotally it did not. Just introduced self-esteem issues and if anything pushed me to emotional eating. I know a few people in the same boat.


A lot of Asian countries are shame-based on the Shame-Guilt-Fear spectrum of cultures, and have some of the lowest obesity rates. I wouldn't be surprised if these countries have fewer obese people and more suicides of people who feel they cannot fit in.


They have fewer obese people because their diet is healthier than the average US diet.


> There are plenty of other countries that have just as much social media usage as the US but don't have the same widespread emotional dysfunction

Are you sure? A quick search says otherwise, e.g., "Effect of social media on youth in [country]."

I guess my search term is favouring a conclusion, but I'm not sure where or how to find proof of your claim.


We're getting the same issues, but it takes a few years to propagate. I'd say the US is ~5 years ahead of the UK, and another 5 years ahead of the rest of Europe.


It’s probably exacerbated here by the lack of walkable cities, public spaces for teens, etc.


The system does not care for their well-being because they are redundant. Drug use was frowned upon during the Cold War because we needed people to win an industrial war against a much bigger enemy. We don't need them anymore, so the system does not care if they end up overdosing in an alleyway. It is all connected: cultural, social, and economic.


damn. this is it. Canada just passed a law saying people can get physician-assisted suicide (MAID) for drug problems.


I'm conflicted about it. On the one hand, I'm angry that society would simply give up and leave people behind. On the other hand, societies have always done that in one rationalized way or another, and at least this way is slightly more honest and compassionate.


Game of Thrones had the Wall to send unwanted men to. Looks like Westeros had more compassion than Canada.


It's just a fashion to have issues at the moment. It comes with living a comfortable life. It's actually a blessing in disguise but can get annoying.


Teens literally put their (usually self diagnosed) conditions in their Instagram bios.

That sort of thing used to be only something you’d share with close friends, if that. It was a big deal if someone let you know they had bipolar, depression, etc.


That's because you can self-undiagnose just as easily. People think feelings are disorders.


There's another phenomenon that I have observed recently where people _want_ to be seen as "mentally ill".


> My guess is social media

Let's not scapegoat social media and ignore the real problems at play: climate disaster, the rise of authoritarian leaders in democracies who use fear as their biggest selling point, crippling debt to merely exist in a healthy manner (in the US), disinformation turning friends and family into anti-vaxx Q-following idiots, capitalism squeezing blood out of every penny... The list goes on.


It's not social media.

Mental diseases are metabolic diseases.

It is the food that destroys our brains and even worse, our women eat shit even during pregnancy, so the brains of newborns already start out with defects.

If we were eating a proper human diet (ie beef and eggs), we would not have these issues on this scale.


Can confirm this is a bullshit theory. Eating beef and eggs did not help my depression years ago. In fact, going vegan has had a positive impact on my mental health.


Beef and eggs cured my schizophrenia!


Food quality, microplastics, other pollutants, unfiltered chemicals in the water supply, an inevitable outcome of late-stage capitalism, social media.

Take your pick.


AKA: Dozens of states' attorneys general could petition legislatures to pass laws saying under-18s are not allowed to use Facebook in their state, knowing it'd probably fail because of freedom-of-speech trespasses, but instead would rather sue the deep-pocketed corporation for profit and reelection-time bennies.


The lawsuit asks for injunctive, not monetary, relief.


False. It asks for both injunctive and monetary relief.

On monetary relief:

On the joint COPPA claim, it asks the court to “Award the Filing States damages, restitution, and other compensation”

On the state law claims, each and every state asks for monetary relief under its own law in the form of civil penalties, damages, disgorgement, and/or restitution.


> knowing it'd probably fail because of freedom of speech trespasses

> but instead would rather

It sounds like you are implying something wrong with the very line of thinking you have presented.


If I let my kid have so much candy that they stop being able to control themselves, and then they give themselves childhood obesity or diabetes, is that the candy makers' fault?

If I encourage my kid to engage in the dangerous sport of snowboarding (this is an analogy) and they take it way too far despite obvious signs of egregious injury to their person, is that the ski resort's fault?

No, at some point it is the parents' fault. If your kid killed themselves because the screen told them they aren't airbrushed enough, their parents should be held legally liable.


A closer analogy is when cigarette companies advertised to kids, covertly or even overtly.

Parents shouldn't let their kids smoke, sure. But that's not the end of the public conversation.

It's not easy or perhaps even possible for a parent's influence to outweigh social programming coming from all around.


Indeed.

So many parents are against smoking, yet naive teenagers still get addicted due to pressure from friends and wanting to seem cool.


Candy eating doesn't control society's social fabric though. Social media does.


Good parents do not scale.

It is easier to punish fb, so we should.


The more correct analogy is advertising heavily to kids that you give out free candy, and then give them so much they get sick.

Parents can't surveil kids 24/7. It simply isn't possible. You have to have some trust that your society is a safe place for kids to grow up. Independence is an important part of growing up: making your own decisions with or without your parents' approval.

Don't you remember being 13 and wanting to make your own decisions about your life? At some point a kid becomes an individual and they will make their own choices whether you like it or not.

Blaming parents for societal treatment of children is the same as blaming climate change on people who don't recycle. It's utter nonsense because the outside factors are massively larger than any individual.


Does this have merit? I don’t know of a single kid on Facebook. Maybe Instagram. But primarily they’re all on TikTok and Snapchat.


There are about 73 million children in the US. I'm going to go ahead and bet that even though you don't personally know one who uses facebook, quite a few of them in fact do.


Probably - but just poll my middle schooler’s class and I can guarantee the mention of Facebook will get blank stares. Insta - yes, that they’re familiar with.

What they use most is TikTok and Snapchat. My point being, the harmful marketing cannot be proven if hardly any children actively use it.


Are we sophisticated enough yet in this field to define addictive software, safe amounts of it, etc., as we do with drugs?


It will be surpassed by the next thing, and we will forget all about the previous thing. Do we define addictive programming, or safe amounts of WWE or reality TV? At various times, Dungeons and Dragons, Mortal Kombat, WWE, and reality TV have all been said to be causing a decline of morals. A hundred years ago we blamed comic books and radio. Did we ever figure out the safe amount of comic books or radio, or did they just get forgotten as we moved on to the next shiny object?


First of all, they need to define what "harmful" is as a quantifiable variable (or variables). Then the states in this case must prove that the content from Meta is harmful in this context. Lastly, they need to prove there exists a statistical causal link from such "harmful" content to the mental well-being of the children. Many parents kind of "sense" social media is harmful, but I don't see how they can "prove" that social media is really "harmful".

edit: grammar


Wouldn't you have to additionally prove that facebook had some legal duty to prevent harm? Otherwise you can't claim that it's facebook's fault, rather than the parents or the state.


Since the lawsuit is targeting "harmful content to minors", I think COPPA does give some legal duty there (not a lawyer and I'd have to look up COPPA to remember the details, so take with a grain of salt).


Cyberbullying? Eating disorders? Majority of their time on social media? Increase in medication usage? Cutting? Depression? Suicides?

There are plenty of things to measure. Just pick one and stop trying to play this ridiculous game of whataboutism.


Ok, so you should be able to sue schools where regular bullying happens, or, for eating disorders, the ads that supposedly promote unrealistic beauty standards, right? Or why stop at ads: sue the media companies that publish them. Or maybe the vendors that sell the magazines, or all of them, since together they make up a platform like Facebook. "Majority of their time on social media" is not a harm. The rest are similar and/or not clearly linked to anything.

Really this lawsuit is just a meta for safety-ist culture that probably causes mental health issues in the first place.

It's like I spent hours per day playing soccer when I was a kid, and regularly got injured, broke my leg once. I got bullied there too on occasion, by some definition. Oh, and we used to play games where the losing team would bend over at the goal line and the winning team would try to kick the ball as hard as possible at their asses (or legs) from like 15ft - I dunno if this is bullying (I've done a lot of both bending and kicking), but I'm pretty sure someone has to think of the children...

So I guess the parents (or no, not parents - states!) shoulda sued the town that provided us with a public soccer field, cause "harms" happened there :)

EDIT: come to think of it, unlike in the other cases, parents cannot control other kids' behavior in school or out in the neighborhood, but preventing their own children from using social media is comparatively trivial. So, there's also that. I guess if Facebook is responsible for harms, all of the parents affected should be sued for neglect :)


>Ok, so you should be able to sue schools where regular bullying happens, or ads that supposedly promote unrealistic beauty standards for eating disorders, right?

If they 1) know of such bullying and 2) do nothing, or even enable it, then yes, I would agree with that. I don't know the legal details, but schools should have a moral duty not to expose kids to harmful factors and to make sure kids are safe on campus.

> Or why ads, sue the media companies that publish them.

Depending on the ad, sure. Wouldn't be the first time. Ads targeting children have a ton of regulations behind them.

>Or maybe vendors that sell the magazines, or all of them - since they make up a platform like Facebook.

The magazine would be under scrutiny. I'm not sure if there are any laws against that (the National Enquirer is still in my grocery store checkout line, after all), so I'm guessing freedom of the press overrides it. It's really just libel they worry about (feel free to look up Hulk Hogan vs. Gawker for one of the few cases where the press loses such charges, since libel laws are very strong in the US).

>"Majority of their time on social media" is not a harm. The rest are similar and/or not clearly linked to anything.

Not by itself, no. The lawsuit seems to be arguing that Meta knowingly deploys algorithms and utilizes data in a way that is harmful. TBD.

>parents cannot control the other kids' behavior in school or out in the neighborhood - preventing their own children from using social media is comparatively trivial. So, there's also that - I guess if facebook is responsible for harms, all of the parents affected should be sued for neglect :)

Regardless of my opinion on that, anything for children (media, ads, websites) has regulations, and the argument here is that Meta has continually and knowingly ignored such regulation. It involves minors, but summarizing it down to "social media harms kids" belittles all the regulation set up over the years. It's not just about "being too addicting".


It is surprising to me that my comment is perceived as whataboutism, as if I am rooting for Meta (I do not see any such implication here). Far from it: I do not like Meta or its products (I don't even have the Facebook or Instagram app installed on my phone). I am merely saying that it is very tough to prove such a case in court, as rigorous vetting is necessary and the onus of proof is on the plaintiff.


Can't pretend I read the whole claim (and even if I did, there's a lot redacted), but the summary of "harm" in a hard sense comes down to:

> They constitute unfair and/or deceptive acts or practices under the state consumer protection statutes, violate COPPA, and further constitute unlawful acts under common law principles.

And given an example

> For example, on September 30, 2021, Davis denied that Meta promotes harmful content, such as content promoting eating disorders to youth, when she testified before Congress, stating, “we do not direct people towards content that promotes eating disorders. That actually violates our policies, and we remove that content when we become aware of it. We actually use AI to find content like that and remove it. [Redacted]

There are 40+ plaintiffs and over 100 lawyers on the lawsuit so I'm not too surprised they have some basic definitions there. Will it be effective, who knows?


Right, you let your kid become an alcoholic or a nicotine addict to find out whether it's harmful.


you're right! nothing matters and anyone should do anything they want!


I think you read too much into my comment; that is not what I was implying....


My gut feeling is that nothing will change meaningfully as long as employee promotions are dependent on increasing engagement (either directly or indirectly).

I believe that no single person sits down and says "let's make kids get addicted to our product," but they do run A/B tests on new features and keep the better-performing variant. The end result of many of these cycles is the same.
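A toy sketch of that ratchet effect (hypothetical numbers and metric; this is not Meta's actual process, just an illustration of selection-by-engagement):

```python
import random

# Each cycle proposes a random product change and A/B-tests it against the
# current variant. The only question ever asked is "which has higher
# engagement?" -- no one ever asks "is this healthy?"
random.seed(0)

current = 1.0          # engagement score of the shipped variant (made-up unit)
history = [current]

for cycle in range(20):
    challenger = current + random.gauss(0, 0.2)  # a new feature: good or bad at random
    current = max(current, challenger)           # the A/B test keeps the higher-engagement variant
    history.append(current)

# No single step set out to maximize engagement, but the ratchet only moves up.
print(history[-1] >= history[0])
```

The point of the sketch: even with changes proposed at random, filtering every change through "engagement went up" produces monotonically increasing engagement over many cycles.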


It is not even about employee promotion.

You can't ban a communication tool. That is just absurd.


I worked at Meta for 7 years, focusing on Youth Engagement for a number of those years. We had incredible sociology research on what is good and healthy for kids online. We were not allowed to build any of it because the Director and VP class only rewarded "number go up" and punished any other activity. I quit the Youth group in frustration because they just wouldn't let me build or even code.


Is any of the sociology research available publicly? Very curious what y'all wanted to build.


Do you regret having devoted so much of your time and energy to the mission of that company?


only took 7 years of collecting paychecks for you to become conscious

nothing will change until people stop enabling meta with their labor


Yeah lol

Research shows 100% of bullying is caused by humans and eliminating all human contact reduces bullying by 100%

Amazing!


With LLMs that doesn't have to be true anymore.


Nothing will meaningfully change; it's mostly a money grab, I assume. Politicians and media have built up this boogieman for years and now it's time to cash in. The kids supposedly negatively affected won't see any of that money either. The social media companies just need to donate more to campaigns to make it go away. Cynical as hell, but watch, that is how it plays out.

>I believe that no single person sits down and says "let's make kids get addicted to our product," but they do run A/B tests or new features and keep the better performing variant.

Television stations make pilot programs to see if a show is any good. Intentionally making bad shows isn't a great business model. Television stations want shows people will watch, otherwise they are wasting a ton of money. I suspect social media is similar.


I remember when Facebook was restricted to folks with college domain emails. I used it for a year and decided I wasn't interested in algorithmic online stalking of people I barely knew or cared about. I'm still surprised social media turned out to be such a big hit. I guess people wanted some way to keep track of social news, and the usual outlets like blogs and RSS feeds were too "nerdy" for the average internet consumer. Now we are stuck with digital panopticons and all the attendant problems associated with them. If you have kids, I recommend teaching them how to properly use computational devices instead of just being passive consumers who get herded by statistical algorithms to optimize clickthrough rates.


Minor nit, but when Facebook was limited to college domain emails it didn’t have an algorithmic feed.


Social media is for socializing... If you don't enjoy socializing (stalking, drama, everything), if you have no interest in those, of course you wouldn't join.

Stats show people are inherently resistant to joining any social media until all their friends are already on there.

If none of your peers use social media actively, or you don't keep in touch with a large group of people, it wouldn't offer much value for you.


Tons of people used things like alumni addresses (or were somehow loosely connected to a college) well before Facebook was opened to all emails. Except at the very beginning, membership wasn’t that exclusive.


1) Since blocking social media for one's children is trivial, does it mean all the parents affected are guilty of neglect? Is there a reverse class action lawsuit? Oh wait, that's bad for election prospects; better go after evil big businesses.

2) There really needs to be a rule for frivolous shakedown lawsuits of the kind the EU and now apparently US states practice, where monetary damages are paid either way: if the fake harms and violations pan out, to the state; if they don't pan out, by the states to the company, with a fine of the same order of magnitude.


As to 1): The prisoner's dilemma is a pretty basic example of how the best possible coordinated action is not necessarily the best possible individual action, i.e.

other kids not on social media / your kid not on social media: Δ0 happiness

other kids not on social media / your kid on social media: Δ-3 happiness

other kids on social media / your kid on social media: Δ-10 happiness

other kids on social media / your kid not on social media: Δ-50 happiness

Maybe some form of coordination might be in the parents' best interest... maybe some form of representative could do something about it, if it's in the kids' best interest?
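A minimal sketch of the coordination failure, using the payoff numbers above (the numbers are the commenter's; the code just computes each side's best response):

```python
# Δ happiness for your kid, keyed by (other kids' choice, your kid's choice).
payoff = {
    ("off", "off"): 0,
    ("off", "on"): -3,
    ("on", "on"): -10,
    ("on", "off"): -50,
}

def best_response(others: str) -> str:
    # Given what the other kids do, pick the choice that maximizes your kid's payoff.
    return max(["on", "off"], key=lambda mine: payoff[(others, mine)])

print(best_response("off"))  # opting out is the best response while nobody is on
print(best_response("on"))   # once peers are on, opting out is the worst outcome (-50)
```

With these numbers, each family's best response simply mirrors what everyone else does, which is why no individual parent can unilaterally move their kid to the all-off outcome (Δ0) once peers are on social media; it takes coordination.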


Judge: Why do you let your kids use social media?

Parent: All the other kids are on it.

...

PS There's some in between, https://support.google.com/families/answer/7103340


To reply just as flippantly: we could take the same approach to alcohol and gambling. Drop all age restrictions and put all the responsibility on the parents; then businesses will truly be able to run efficiently.

Of course, good parents have to be involved in raising their kids well, but that doesn't mean we shouldn't subsidize education or restrict companies from sending kids down coal mines (even if they really want the money). Mostly because good parents can fail and bad parents exist, and if society isn't going to aim for turning kids into good adults, then it'll just get worse generation after generation.


The approach to alcohol isn't all that different: it's entirely legal for a parent to serve alcohol to their kid. If it were possible for each store to determine the parents' wishes, we'd probably have that.

Parents control the phone and the phone plan. Parental controls that exist today can set how long an app can be open per day. What's the outcome from suing Meta that's more effective?

I agree that policy should protect kids and to do that it needs to be effective, not just cathartic.


Cool, do tobacco and vape companies next.


I think you may have missed the past 30 or so years [0]

[0] https://www.fightcancer.org/news/department-justice-lawsuit-...


Tobacco ads are banned and tobacco companies are required to pay the government to make and run anti tobacco ads. Social media facing the same regulations is a reasonable end goal here I think.



The FTC is looking to hire child psychologists to evaluate the effects of these platforms, so expect more of this to come.

https://www.cnbc.com/2023/10/23/ftc-plans-to-hire-child-psyc...

There's a thin line to be walked between privacy and regulation, but Facebook's own research has pointed at the harmful effects of Instagram on the mental health of children... so I think some scrutiny here is overdue.

https://www.theverge.com/2021/10/6/22712927/facebook-instagr...


> Facebook's own research has pointed at the harmful effects of Instagram on the mental health of children

Facebook’s own research has found both positive and negative effects, depending on the child’s status among their peers.

The Verge is quick to point out that:

> Instagram makes teen girls feel worse about their bodies

but more discreet in reminding people that fashion magazines, including The Cut (also published by Vox), have the same negative effects and none of the positive ones, like helping gender minorities find a safe space or re-socialising people with physical handicaps or anxiety. The critique makes sense —I’m not denying it— but it doesn’t come from “the algorithm.” It comes from having a lot of money poured into the Instagram creation ecosystem. Instagram has very little control over that, as most of it is through brand deals or magazines raising their own profile. That behemoth of money and attention comes from an industry that has decades of well-documented problematic practices behind it, including the unrepentant glorification of using self-starvation and drugs, notably cocaine, to reach unhealthy thinness.

If you condemn Instagram but don’t include that industry among the culprits, you’ll have Snap with unhealthy models, TikTok with unhealthy models, or whatever comes next with unhealthy models.


How does Instagram re-socialise people with physical handicaps?


Allowing them to find friends who are in similar positions/situations and understand what they live with.


Hmm that's any piece of connected software on the internet though. Shutting down instagram wouldn't compromise that.


Connected software, with very few exceptions, doesn’t attract a very large audience. For rarer situations, you aren’t going to find that person easily. Having worldwide interest groups (like subreddits, like thematic Discord channels) helps, but is conditional on people using either Reddit or Discord. Facebook and Instagram still win compared to those two contenders.

That’s of course assuming that Reddit and Discord wouldn’t get much bigger if Meta were to close, but I’m not sure the negative effects would vanish because the owner is different.


is it safe to assume that you've worked at meta at some point? because this comes across as a little defensive

The post is about Instagram, so I'm talking about Instagram; feel free to post an article about magazines if you want to talk about magazines... legislation aimed at Instagram will create precedent to pursue other platforms like TikTok and can also become ingrained as policy. Just because we're focusing on one subject here doesn't mean we're ignoring the others or playing favorites.


The post isn’t about Instagram to the exclusion of the fashion ecosystem: it’s an argument that truncates research which finds that the drop in self-esteem is directly related to seeing posts either from fashion professionals or from peers trying to emulate them. You can’t read those studies and think that detail is not part of the key findings.

Those are not two similar but separate problems.

This is like saying that mail bombs are the problem of the post office and not people having access to explosives: the problem isn’t envelopes. It’s people with access to explosives, and FedEx would likely have the same issue—unless federal rules allow them to scan and to refuse to deliver explosives.

I’m not being crass with a bad metaphor: focusing on the medium has been a key issue in the argument for a while. Fashion magazines used to be decried as “glossy paper,” but no one thought the difference between magazines pushing problematic self-image and serious news was that newspapers came on broad sheets of matte paper. Still, that’s how the problem is presented now: without clear separation.

I did work at Meta on the team looking at Teenagers, and I did raise that point internally: should we look at gambling and alcohol and extend the same rules to fashion? Should we boost peers over brands to avoid problematic ideation?

Those conversations, without the shadow of fashion partners, were generally productive. It wasn’t perfect: the goal remained “engagement” but there were no sacred cows to avoid.

As soon as the findings about teenage body dysmorphia were put in the context of fashion (and presented to one particular executive who cared about advertising more than algorithmic boost), that question was buried. Several friends of mine got blackballed hard not for suggesting Instagram-specific treatment, but for instance, for asking Anna Wintour about the quip where she fat-shamed her best friend in _The September Issue_ (it’s a niche fashion reference, but it’s widely considered a smoking gun in the industry).

There are inherent biases to how Instagram and Facebook work: you post when you achieve something, so there’s a bias towards success, but the internal findings rarely found that those were crippling. Thinness, on the other hand, leads to clear, widespread medical issues. I remember asking whether the issue became more prevalent because teenagers didn’t have access to that many magazines before, or whether the dose effect was comparable. I don’t think anyone has looked into that.

I’m not saying that to absolve Meta of its responsibility: I had argued for explicit filters, like giving teenage boys who know they can’t resist, and will behave in a way they think is unhealthy, the ability to exclude scantily clad women from their feed. I used another example, naturally (one more typical of internal debate). That idea could have had legs ten years ago; now, with the debate being a lot less constructive, I'm not sure.


the post office did actually implement stricter screening, including x-rays, chemical detection, and GPS tracking, due to mail bombs... in a way mail bombs sped up the process of implementing package tracking systems for everyone — they took a very serious approach to safety despite not being the direct cause of the problem

from my perspective, this is the kind of regulation the FTC should seek out — so for different reasons I agree with your analogy

magazines are an entirely different beast and it's strange to not address that — I have to physically go out of my way to purchase a magazine or have one delivered... that's something I opt in to

people opt-in to instagram, but for teenagers there are much larger stakes involved... instagram has become an integral part of many teens' social life (by design) and they're constantly getting targeted and personalized ads fed to them directly

the fact that the goal remains engagement makes it pretty clear where meta's priorities lie, and it's not on the side of safety


I’m good with damning them all and getting rid of shit technology directions that hurt people, especially kids.

Who is actually ok with this?


The people making money from it; either the company itself or the employees working for it that are designing it. They clearly do not have issues with it.


lol. The meta drones showed up to downvote the comment you replied to, so I’m going to say you are spot on.


Here's the original Facebook internal research, if people want to compare and contrast the actual results with the media coverage: https://about.fb.com/wp-content/uploads/2021/09/Instagram-Te...


As someone who is very pro free speech and hates censorship in general, banning these toxic social media platforms, especially for kids, is a good idea, and as a parent, I am all in for it. If you cannot smoke/drink till 21, you shouldn't be able to do social media till then. It is that bad, especially for children.


I'm not necessarily against putting an age limit on social media, but I can't for the life of me figure out a definition of what social media is that wouldn't be either useless for this purpose or make using the Internet impossible for kids.


I guess it could be up to the parents to decide. You could just block Facebook. Parents need to cooperate, though, so children who don't use all these toxic-feed, spyware kinds of social media won't be social outsiders.


A 20 year old person is already old enough to use Social Media, smoke and drink. By 2 years at least. Y'all insane.


As someone who drank at 18, I think it is too young. Alcohol is a drug just like coke, cannabis, etc. By holding off until 21 you can reduce the amount of long-term addiction.


Counterpoint: You can reduce the potential for addiction with education as well, and no time better than an early age. In Europe it’s widely accepted young adults will drink, and we can do so as young as 14 in restaurants (that’s the lowest I know).

Allowing it and not making young adults hide what they’re doing is so important. That way it can be regulated by a responsible adult, and someone on hand to make sure there’s a bucket, some water and a blanket with no danger or fear of recourse.


Chronic (as in regular, not heavy) drinking causes changes in mood, stress levels, and body function. A glass of wine a day, for example.

Reference: https://youtu.be/DkS1pkKpILY?si=SIn7pg3ebRGbHcXQ

Alcohol is so ingrained in popular culture that criticising it is almost taboo.

Culture will play a big part in what people under the legal buying age consume with family. It can be a good thing, but I reckon a glass of wine once a month at most.


>By holding off until 21 you can reduce the amount of long term addiction.

This is empirically completely and utterly false; the US has far more alcoholics than many countries with a lower legal drinking age and responsible drinking cultures.


You are confounding


> By holding off until 21 you can reduce the amount of long term addiction.

Most countries offer a counterexample.

Extreme binge-drinking to the point of harm and even death is quite particular to the US. It's a result of the prohibition on drinking, so the forbidden fruit is abused. In most countries kids drink mildly with family from 13-14 and with friends after that, but it is never seen as a big deal. So there's no pressure to go crazy at 21, or to hide it before that. When something is normal and not a big deal, it's not abused.


Yeah, but why not also ban candy, soda, junk food, video games, etc.?

If we can't trust parents to parent their kids then why not ban everything that possibly could be bad for kids?


Most of those things have become subject to both government regulation and industry self-policing, and there’s continual debate over whether there needs to be more done and what shape it might take.

Just getting to that point with social media will be a big win.

The social media industry came on quick and got a free pass for a while, but it’s now more mature and needs to become more mindful of its social consequences just the same as other industries. Whether that needs to come through bans and regulation or can be achieved through voluntary self-restraint is up to the big players and their egos.


By that logic I guess we shouldn't have rules against companies selling children food with ground-up glass in it. If you can't trust parents to keep their kids from buying it, why not have the government ban everything that could be bad for kids, right? Personally, I'm very glad that the government doesn't allow companies to sell children food with ground-up glass in it, no matter what individual parents do or don't allow, and nothing about that ban means we should also ban soda or candy.

We can absolutely look at products and services that are actually harmful and deem them unacceptable for children without banning every other thing on the planet that might possibly be bad. Some states feel that Facebook has risen to the level where it's harmful enough to do something at the government level. I think that's probably what's needed since facebook has demonstrated that they're unwilling or unable to stop the harm they're causing without government intervention.


I get your point. But we have to draw the line somewhere, and soda (even though addictive) doesn't have the same level of peer pressure or damage vector as social media. If all your 9-year-old friends are on it, you bet you wanna be on it.

I will also argue that things like soda can harm you but are much easier to restrict than social media access, and social media has compounding effects on your mind, especially if you are a child.


Parents are not police; we as a society also need to define boundaries for what is bad. Parents could also have prevented their kids from going to work in mines back when libertarians said that was OK, but parents are not bodyguards. What if some parent is not fit and their kid takes heroin? Is it OK for a kid to carry the effects of a life of addiction because their parents couldn't police them well enough? If a kid is born to libertarian parents, should he suffer his whole lifetime just because of the parents he was born to?

What I am saying is: why do you and other corporatists think it's OK for kids to suffer if their parents aren't able to show them a balanced way to live and approach tech?


I agree that it's a bad idea to let kids do this, but... you can still smoke/drink at home if your parents let you.


When you treat people like children, they will act like one. Putting a law in place to _try and_ stop kids from being an adult, will result in even more grown up children. Part of being an adult is dealing with stuff you don't like and handling such, reasonably maturely. Let the kids learn not to touch the hot stove.


That's really extreme. There are definitely better ways to manage it. Tiktok in Singapore has been very thoughtful in their approach which seems to be working.


What are the ways? What has Tiktok done in Singapore? What makes it thoughtful? Why hasn't Tiktok done so in America?


My view is more extreme: ban all social media as a useless and dangerous means of keeping people endlessly watching harmful content just to show them advertisements that make them feel inadequate so they'll buy back their self-confidence. Banning social media just for under-21s is quite weak by comparison, but at least it's a start.


"harmful youth marketing" ? Adults are just as dumb as kids, and can actually impact the rest of the world with their dumbness. We should be suing them for harmful marketing, period. I have had adult friends have literal emotional breakdowns because of social media's harmful effects.


Adults have a lot more responsibility for their actions. One of those actions is choosing to participate in a system that is widely considered harmful in multiple ways.


It's harmful for emotionally damaged people; the rest of us shouldn't have to suffer because a few people never learned that envy is a flaw to be overcome, not a virtue to be cultivated.


FINALLY we're getting to the point. The problem with facebook is first and foremost that it is addictive and was engineered to be addictive. About time.


So if your app gets too "good" and users use it too much, then you have broken the law?

Interesting logic to think through.


If you’re filthy rich and become aware that your app is causing harm, and then don’t work to address it, you’ll want to start warming up the lawyers and lobbyists, yes.


What level of “rich” triggers the law?

What level of “harm” triggers the law?

Am I harming you if I make an online store and you browse it too much? How do I know what too much is? How much browsing of a social network is too much?

I don’t understand how you think this is all crystal clear :)


Think less “app too good” and more “big tobacco strategy”. Because let’s not kid ourselves: that’s exactly what these apps are trying to do – become as addictive as possible. On purpose.


Everyone wants to sell more; that's far too broad a category. Tobacco is bad because it is chemically addictive, and its delivery is paired with inhaling tar.


Yes everyone wants to sell more, but social media companies purposefully try to make their product addictive. There are entire teams of psychologists and behavioral experts figuring out how to maximize engagement through manipulating your emotional response to stimuli.

It’s similar to the superstimulus effect in junk food. The stuff is designed to hijack your instincts without providing much in return. We think we get more value out of social media (or a bag of chips) than we actually do because it’s purposefully designed to make us feel that way.

https://en.m.wikipedia.org/wiki/Supernormal_stimulus


Every company wants you emotionally hooked on their products. Ford Motor Co wants you obsessed with Ford vehicles.

Facebook has an easier time doing studies, but companies have spent decades trialing different ads, marketing campaigns, sales pitches, etc.


This is the intention of capitalism - profits go to the companies who can build products that people are most enticed to buy. A company that doesn't work to make their product loved/addictive will not continue to exist.

Another way of phrasing what you've said is that we should prevent companies from building products that people really like and want to use. No one is forcing anyone to use Facebook.


Is a constant dopamine hit not addictive?


If its delivery coated your lungs with tar, it would then be comparable to big tobacco.


If social media use has been causally linked to depression and friends, that’s kinda like coating your brain with tar.

https://mitsloan.mit.edu/ideas-made-to-matter/study-social-m...


If I gave you 10 random people to run tests on, you could tell me who the heavy social media users were? Because I could tell you who the heavy smokers were, if you gave me the same.

Social media may have its problems, but these terrible comparisons to drugs are silly. They're not going to convince anyone but those who already don't like social media.


Which kids use Facebook? Instagram maybe. They should possibly sue TikTok and Snapchat


Yeah, I guess if your heroin hits right and the fiends keep coming back for more, it should just be legal. I think the issue here is that these apps are designed to defeat even the little self-control that humans are capable of.


hoping that all mental health organisations will join


Hmm, interesting, nothing against TikTok or the Fentanyl crisis?


I mean, various groups are pursuing action on both. The US government is a pretty big institution. It tends to do more than one thing at a time, you know.


Cool, when was the last time 33 states or more coordinated against TikTok?


I wonder how many simultaneous proceedings they have?


I'm impressed by the strong negative reactions in this thread. Maybe it's because this is a US-centric forum, or because I don't have kids, but I fail to see the big deal with Meta. Facebook is mostly groups with little engagement. Nowadays, people really fight on Twitter, which has less moderation. Then Instagram; I thought kids were on TikTok?

Well, I can see how some people might be negatively impacted by these sites, but it has to be marginal considering we're talking about billions of users. And what's new? The same could be said of tons of other businesses (how many people are obese with reduced life expectancy because of aggressive marketing from McDonald's and Coca-Cola, for instance?). I mean, is Meta that bad?


There's an amazing subtlety here, which is that Governments are slow to function and rarely regulate well. That they're targeting Meta instead of what kids actually use (TikTok) highlights this fact.

So not only will they fail to achieve their goal, they're not even going after the right player.

But hey, the cultural movement of Americans these days is to hate on big tech, and Facebook in particular. So here we go.


What’s hilarious is that they are going after TikTok, but because it’s a “national threat” (lol)


> the cultural movement of Americans these days are to hate on big tech

if only we could have such giants in Europe...


> with features like “infinite scroll” and persistent alerts used to hook young users

> Meta has harnessed powerful and unprecedented technologies

Are we talking about infinite scroll or heroin? It would make more sense to sue Kellogg's over Lucky Charms commercials, but we're all used to them so there's no outrage.

It's politically popular to sue big tech and especially Meta. That's the whole story.

To be clear, the lawsuit is not about the content on instagram or privacy. They're saying kids spend too much time on Instagram. Let me introduce you to the parental controls that are already on your kid's phone.


> It would make more sense to sue Kellogg's over Lucky Charms commercials, but we're all used to them so there's no outrage.

Of all the examples you might offer, it's ironic that you chose this one!

There's a TON of political history around breakfast cereal composition and advertising as well as ongoing reform campaigns and media criticism. In some cases, this has settled into formal regulations about things like what cereal boxes can look like and when advertisements can run. In other cases, it's involved collective industry "good behavior" meant to forestall regulation. But it continues to be a problem because financial incentive always leads to optimizing for whatever can be gotten away with.

And yeah, social media companies are maturing as an industry and are now due to play that same perpetual game of government regulation, industry self-policing, and boundary-testing.


> due to play that same perpetual game of government regulation, industry self-policing, and boundary-testing.

Does anyone think this stuff works, though? It's a cycle of outrage and performative regulation. I'm not sure the same path in another industry is a good idea.

Lucky for parents, today there are TV options without commercials.


Why does the solution have to be one-sided?

Why can't we educate parents about controls and stop Meta from intentionally making their software addictive? Heck, we could even fine Meta and have the money go to efforts to reduce the behaviors they've worked so hard to create.


> stop Meta from intentionally making their software addictive

By regulating the existence of a 'next page' button? Curious if you can think of user interface rules that are less absurd.


Is that in the article? It's paywalled, and I can't find other mentions of that.

I've assumed that the suit is over the fact that Meta intentionally and algorithmically seeks to test what content gets you to stay on their site and feed you more of that.

I also don't remember the last time I saw a Next button on Instagram or Facebook. Maybe they have one for children? But if they're still proactively encouraging addictive behaviors, I'm not sure the specific UI element matters as much.


The lawsuit calls out infinite scroll and notifications. To be fair, I've not read the entire thing.

https://storage.courtlistener.com/recap/gov.uscourts.cand.41...

Whether we're regulating UI elements or a recommendation algorithm... It's one thing to say they're "encouraging addictive behaviors", but what's a rule that identifies what part of a mobile app needs to change? It's an unproductive path.


How do we differentiate "addictive" behaviors from "non-addictive"? Is it illegal if people like your product too much? There isn't a clear definition here of where free will comes into play.


we’re talking about dopamine production and that’s always what we wind up regulating

so, good observation. regulate aspects of these products like heroin


This has reached absurdity: everything we do releases dopamine. You would ban all activities humans do for pleasure.

If you want to make the case that IG is unhealthy, you need to argue along a different axis, because the problem with IG and heroin isn't that they release dopamine.


I look forward to the absurd user interface rules that get proposed or even enshrined in federal regulation. It's an impossible task.

Alternate path: attach electrodes to kids, measure dopamine levels, and turn off the app if they go too high.

Simpler, but less fun: set the max time an app can be open.


I'm convinced that a decade or two from now we are going to look back on social media exactly like we see the tobacco industry today. The companies knew what they were selling, knew the harm, but put profit over public good.


And like tobacco, they mostly got away with it.

Same with oil companies.

Of course there are many entities doing bad things, but few have such a dramatic impact on all of civilization.


And countless employees happy to take a paycheck while looking the other way.


A fact that gets glossed over too often on HN. It's not just company leadership making these harmful applications: Everyone in the org chart down to the individual engineer is actively doing this--maybe even you, reader of this comment. Did your last "git push" contribute to the problem or help solve it? Will you be proud to tell your teenage grandchildren about what you're creating? You can't just hide behind "boss told me to do it" and wash your hands of the ethics.


I agree. We all want to work at FAANG until we really look closely at what they do. No thanks. I'm fine getting paid half to keep my soul intact.


I personally don't blame the fastfood worker for giving me a Big Mac.


Yea, all this cancer and burden on the healthcare system caused by social media. Just like tobacco.


perhaps social media hasn't caused cancer but I'd bet it's caused suicides, eating disorders, and a burden on the mental healthcare system


ah so like gq, vogue, teen magazines, mtv, hollywood - the traditional media.


exactly so, and we should be doing something about all harmful advertising to children in proportion to the harm that it's causing. If facebook is a larger problem than teen magazines then that's where our attention should be.


Will we be saying the same thing about cannabis dispensaries?


IMO no, because legal cannabis is sold with warnings plastered everywhere just like cigarettes. (Unless you think the existing disclosure doesn't properly warn users of the danger)

I bet this case would never have been brought if Facebook had a "This product is addictive" warning every time you open the app.


We know the downsides already, and most think the downsides of criminalization are higher.


> We know the downsides already

In the sense that the downsides are documented in the relevant scientific literature, sure. But popular perception of those downsides is very different. A whole lot of people think those downsides are a pack of lies: that cannabis isn't habit forming, that it doesn't make the average user lazy and complacent, that it doesn't increase the risk of schizophrenia, that driving under the influence of it isn't unsafe and is probably even safer, etc. The downsides are all "DARE lies" according to many of the people who think legalizing cannabis is worth it, who weighed the harms of criminalization against 'cannabis is literally harmless.'

(FWIW I voted to legalize it in Washington in 2012.)


You started down a path that was sane then took a sharp turn.

People recognize that cannabis is habit forming; it's just not as physically addictive as the things we label "dangerous addictive drugs." Most people are addicted to caffeine and we generally accept that it's fine. In the same way that moderate drinking doesn't turn into alcoholism for almost everyone, the same goes for cannabis. In contrast, everyone who smokes nicotine with any regularity at all will become addicted to it.

"average user lazy and complacent" -- That's like why we're here man.

I don't know a single person who thinks it's fine to drive while high, I do know some stoners whose tolerance for "sober enough to drive" is suspect but that happens when we hit the bars too.

I think you're applying the word harmless to a degree that no one else does and then are arguing against that. Like that cheeseburger increases your risk of heart disease but no one's out here being like "cheeseburgers considered harmful."


> People recognize that cannabis is habit forming

You say that, but there's a lot of people who flat out deny it. You're saying that everybody knows this, but that isn't true. Same thing with the driving under the influence thing. A lot of people have told me that they drive safer when high because it heightens their senses and slows their roll and makes them paranoid and therefore careful, and a whole lot of cope like that invented to excuse themselves for doing it.

The downsides of cannabis that you claim everybody understands and recognizes are in fact flat out denied by many people.


No?


Personally I think the effects of social media will be overshadowed by the impact of internet pornography. More and more young men every day are lamenting the issues it causes them and more and more relationships are falling apart as a result.

Historically, though, most doomsayers are proven wrong. Let's hope we are too.


> Meta, along with Snap, TikTok, and Google, now face hundreds of lawsuits claiming they are to blame for adolescents and young adults suffering anxiety, depression, eating disorders, and sleeplessness as a result of their addiction to social media. The companies also face scores of complaints by public school districts on behalf of students alleging that the platforms have created a public nuisance.

I hope the money is worth it.


How often, again, do you hear billionaires say "Oh no, I wish I had been more ethical and not made mountains of cash off of shitty behavior"? Because I'm not having any recollections of this.

The fines are just a cost of doing business for them.


I agree the US is soft on punishing white-collar crime. But the Sackler family comes to mind. And of course there's Sam Bankman-Fried, Bernie Madoff, etc.


No member of the Sackler family nor Purdue's executives ever faced prison time.


For breaking what law?

Also, how is nobody in the government responsible?

It really feels like the Sackler family has been bundled up as a nice solution to the whole thing.


No member of the Sacklers ever went to prison. I can't speak to SBF, but Madoff made the crucial error of defrauding other rich people. I bet if he'd stuck to old people and poor people he never would've been convicted.


Bernie Madoff and Sam Bankman-Fried both made the mistake of messing with the money of other rich people.


Most businesses need to buy their raw materials to produce their products. With Meta, the raw material is consumer data, which users have been giving away to Meta freely.

The fines only slightly raise the cost of this raw material from $0, so it's unlikely to deter Meta's behavior until someone faces a more severe penalty there.


Strip mining users since 2004!


This -- just ask Microsoft with their refusal to fix default browser behavior, even in the latest versions of Win11 which are supposedly fixed. I guess having Edge be the default is just too juicy to resist, even with that €561 million fine. https://www.ctrl.blog/entry/windows-system-components-defaul...

It'd be great if the USA had some kind of effective consumer protection agency, but I guess then we might have fewer billionaires here.


What people think and feel in private is often not what they admit in public. For many of these individuals, there is massive incentive not to admit such things.

Doesn't make it right, but we shouldn't assume the people responsible don't feel responsible simply because we aren't hearing about it.


I mean the same holds true for what people feel about what happens to them versus what happens to others. You are measured by your external actions not the internal "well maybe I shouldn't have murdered the world" voice the rest of us don't hear.


I don't really see how this is relevant

If we don't like how something works, we should understand the factors that shape it. If someone's internal preferences are contrary to their actions, that's worth knowing, as it says something about the strength of the incentive structures. This is true for any person of any status, not least the rich, who, ideally, should not be destructive forces in society.

Private preferences and feelings are relevant to the analysis.


Kids have been having those problems well before social media came along, so it's quite a bold claim to say they know social media made them worse. But it's a fair bet to say not a dime from the lawsuit will go to a single victim. So I'd take a really skeptical look at both sides.


I'm going to call out this fallacy every time I see it. Maybe call it the dosage fallacy?

Yes, bullying has always been a problem. Yes, society has always been force-feeding kids dangerously unattainable body images. The problem is not that these things have just been repackaged into a different kind of book or TV show. The problem is that it takes much less effort than any time in history to fake a "community" all parroting the same dangerous lie.

The old adage went "If I run into one asshole today, then I ran into an asshole. If I run into 100 assholes today, then I am the asshole, and I should reflect", and it was pretty good advice. But if one asshole can create the illusion that you ran into 100 of them, what's the appropriate response? The whole problem with facebook and instagram is that they have made coordinating bullying campaigns much MUCH cheaper from a labor standpoint. Couple that with the constant drive to keep kids on the platform and you have a system that seems purpose-built to drive already anxious kids to suicide.

I can 100% keep my kids off social media. But I can't keep their classmates from assembling pages impersonating them and driving bullying via that avenue. Facebook and Instagram's engagement algorithm plays a serious role in that operation.


> Kids have been having those problems well before social media came along

Suicides seem to have gone up and social media use might have something to do with it.

https://news.byu.edu/intellect/10-year-byu-study-shows-eleva...

> Girls who used social media for at least two to three hours per day at the beginning of the study—when they were about 13 years old—and then greatly increased their use over time were at a higher clinical risk for suicide as emerging adults.

https://socialmediavictims.org/mental-health/suicide/

> Over the last ten years, there has been a significant rise in the risk of teenage suicide. Although several factors play a role in an individual’s choice to take their life, recent studies have established connections between mental health issues such as depression and suicidal ideation, and social media usage.


Hyperbolic BYU press release. Link to actual paper. https://link.springer.com/article/10.1007/s10964-020-01389-6


The evolution of rates in the US is quite different from that in Germany and France; I would not rule out some strong confounders.


There's been a significant increase in anxiety and depression among adolescents since ~2012. It's not just business as usual, and it's significant enough that the same increase is visible in hospital admissions for self-harm. Social Media seems to be the only explanation that fits the timing and scale of the issue:

https://jonathanhaidt.substack.com/p/13-explanations-mental-...


Google is actually fairly safe due to their utter inability to build a successful social media platform.


Youtube.


Ah fair as a general risk. Not sure I'd call it social media though.


Good. For once I’ll agree with the governance in Utah and say we should ban social media for kids until they are 18.


So no kids on Hacker News? How is HN going to verify my age?


January 1st Club, Reporting In.


I used to think they'd get suspicious, so I used 03/02/01


KYC for social media


The law in Utah exists to prevent young people from communicating with like-minded peers in communities like r/exmormon or r/lgbt.

The issue is billion dollar companies manipulating people for profit, not that people are able to communicate with each other over the internet. It's entirely possible to host communities that aren't corrupted by investors who want 1000x returns on their investments.

However, the Utah law defines social media as any online forum belonging to a site with more than 5m users that allows people to register accounts and interact with others[1]:

> (9) "Social media company" means a person or entity that:

> (a) provides a social media platform that has at least 5,000,000 account holders worldwide; and

> (b) is an interactive computer service.

> (10)(a) "Social media platform" means an online forum that a social media company makes available for an account holder to:

> (i) create a profile;

> (ii) upload posts;

> (iii) view the posts of other account holders; and

> (iv) interact with other account holders or users.

The law uses broad strokes to ban much more than just social media; it bans any group or community that belongs to a parent site with a moderately-sized user base.

In effect, the law isolates anyone under 18 who might want to find community outside of their potentially abusive or extremist parents.

[1] https://le.utah.gov/xcode/Title13/Chapter63/13-63-S101.html?...


Why is it okay for a 19 year old but not a 17 year old?

If they’re bad, they’re bad for adults, too?

If they’re not inherently bad, then they can be safely used by responsible 16 year olds as well as responsible 18/19 year olds?

We let 16 year olds operate deadly machinery traveling at many meters per second. Social media is at least less potentially harmful than automobiles.


Because at some point you decide that some threshold is useful to have, and stop sealioning on the internet about it


This is the first I’ve heard the term sealioning. Just looked it up and man, it feels like one of those foreign words that perfectly describe something that we encounter all the time. Thanks!


Doesn't seem to relate to the Wikipedia definition:

> Sealioning is a type of trolling or harassment that consists of pursuing people with relentless requests for evidence, often tangential or previously addressed, while maintaining a pretense of civility and sincerity, and feigning ignorance of the subject matter


Well, I agree it isn’t the perfect context but it isn’t wholly out of left field either, depending on the user’s interpretation of the previous poster’s quibbling over age.


I don't think so. No one was asking for facts/evidence.


I disagree that the poster you replied to was “sealioning”. It seems he was using rhetorical questions to try to make an argument. Basically he was following The Socratic Method.

https://en.m.wikipedia.org/wiki/Socratic_method

Essentially using questions to challenge assumptions in the form of argumentative dialogue.

This is a legitimate form of having a debate, and it has been used for thousands of years. To compare it to the internet trolling practice of sealioning, where somebody just asks for evidence repeatedly, is disingenuous.


Agreed. If something like "sealioning" is first-order bad-faith argumentation, then falsely accusing someone of doing it, hoping observers don't notice the difference, is second-order bad faith.


It's a bunch of rapid-fire questions about a topic that's been well trodden, and on top of that just a bunch of whataboutisms. It's classic sealioning.


> machinery traveling at many meters per second

Such as a bicycle. And not a car, in almost every country in the world.


Social media is also bad for adults. It's a lot harder to prevent adults from making bad choices for themselves than it is for kids.


How many 16 year olds learned to steal cars from social media?


Is this the social media's error or the car manufacturer's error? Or maybe it's really law enforcement's error?


Or maybe the parents and every other adult in the lives of these children that failed to teach them that stealing a car is wrong, no matter how easy it is?


I agree it's partly a failure of the parents. But let's be honest: even kids raised in good homes can be shitheads or be influenced by them. It makes sense why some of my friends' parents didn't like their kids hanging with us. I get it now... If I had kids, I too would probably make prejudiced judgments about shitty parents and wonder if I should let my kids hang with them, knowing they are unsupervised or in a place with immature guardians.


The adults in these children's lives don't have the same level of access to these kids as the people on social media. Parents are only human; they can only be 'on' so many hours of the day. The phone and its neverending stream of toxicity is always there, waiting for every free moment.


I think the video stayed up too long. Normally you would learn shit like that from your older cousin who learned it from someone in juvie, or from your uncle's lock-picking magazine. Nowadays you can literally learn it on YouTube or in the dark corners of Discord chats.


Shout outs to the person who wrote this:

> My reason for reducing my social media presence is the Like count next to every thought expressed. By adding a publicly visible number next to every expressed human thought, you influence behavior and thinking.

-- https://news.ycombinator.com/item?id=19325515


Several comments say we should ban social media for people under 18. Such a blunt, indiscriminate, careless solution may indicate the typical ages of HN members!

What do some people under 18 think of the problem and solutions? I'm not sure how many are here, but maybe some HN members could ask their kids.


I'm mid 20's and it's obvious what to do. Universal child care, housing, healthcare, etc, and parents would have more time to parent their kids instead of slaving away to make someone else enough money to buy their fifth yacht or passing their childcare off to an ipad. We have worse than gilded age inequality, 9 people own more wealth than 3.6 BILLION combined. Maybe address that first??? I'd wager we'd find 99% of problems are downstream of poverty and inequality.


I'm in my late 40's and I agree with you 100%, more now than I would have at any point previously, and I certainly don't consider myself a bleeding leftie. Might be interesting to see how the "pull yourself up by your bootstraps" HN/SV crowd react to your comment.


Zero trolling: When did this become obvious to you? And why/how?


Slowly, over the course of the last 48 years. There was no pivotal moment. Probably the key thing was the realisation that even with the very good wage I'm on, I was unlikely to be able to save enough deposit money to buy a house in my city without at least one of my parents passing away. And then the realisation that there are people who work far harder than me, earn far less, and understanding that they are in an even more fucked situation.


It sounds like you are in the Bay Area. Did you ever consider moving to a lower cost location? Assuming that we are talking about the US, the "flyover states" (midwest) have some insanely good value. Land and houses can be very cheap.


If you want to think about the correlation between hard work and income, watch someone operating a jackhammer and then visit a golf course.


Watch the person operating the jackhammer go to a golf course, or watch someone operating a jackhammer and then visit a golf course myself? Your sentence is ambiguous.

Are you trying to say that you think that some manual labor roles are overpaid? Why can't a jackhammer operator play golf if they want to?

In my country an entry level jackhammer operator makes about $60K and an experienced one about $100K. Doesn't sound unreasonable to me.


I meant, compare the incomes of the jackhammer operator, who is working very hard, and the people on the golf course, who are playing golf.


I mean I get your point ... rich and lazy executives often play golf, but my buddy plays golf and he's a manager for a crew at a painting company. Hardly a millionaire. Perhaps you're thinking of private golf courses which are sometimes very expensive. Public ones generally aren't.


If you look at history, the biggest changes that allow people more free time to spend with their children or otherwise, are technological ones.

Look at the mechanical loom (industrialisation), pesticides (agricultural yield increase), penicillin (life expectancy increase), painkillers (life quality increase), clean water (wide-spread infrastructure), cheap microwaves (overseas slave labour), reliable vehicles for all (fossil fuel infrastructure).

We need to continue this trend to fix the problems. The billionaires aren’t blocking us. Yes, it’s unfair that they hold so much wealth and power. The world has been that way for a long time.

When electric lightbulbs came to the masses, it made some rich people even richer, but that’s not the important part. The important part is that the masses’ lives were better.

Next steps for us:

- Find manufacturing processes that don’t require overseas slave labour

- Agricultural processes that don’t need as much pesticide

- Better, cheaper medicine (recent developments in quantum computing suggest we may be able to crack protein folding within 10 years, which will lead to an explosion in medical research)

- Automated construction (see Hadrian X)

- A replacement for fossil fuels (if we crack fusion, we can use that for the grid and we can create our own combustion juice for vehicles using the enormous amount of energy available)

- Protocells that allow extremely precise manufacturing at scale (protocells are programmed cells or completely custom cells that we can equip & program for certain tasks, like “remove all the iron ore from the ground here” — they will change everything)

These are the things that will bring the rest of the world out of poverty. And, depending on how we adapt to the irrelevancy of capitalism as these technologies develop, it may solve the inequality/unfairness problem too.

Taxing billionaires won’t accelerate any of this.


> If you look at history, the biggest changes that allow people more free time to spend with their children or otherwise, are technological ones.

There's also the 40-hour, 5-day workweek; the end of child labor, parental leave - those developments, and more like them, led to more time with children than the mechanical loom.

In fact, as technology has improved, it's possible that the proportion of children with a parent devoted full-time to them has decreased (because now families need two incomes).


People had more free time before the Industrial Revolution[1].

Increases in productivity don't translate into more leisure time; they translate into getting more work done in the same time frame while being paid as if you're doing the same amount of work.

[1] https://groups.csail.mit.edu/mac/users/rauch/worktime/hours_...


>the biggest changes that allow people more free time to spend with their children or otherwise, are technological ones.

The important bit to note here is the allocation of that gain in efficiency. I agree tech is the driver of the positive-sum games we call the economy, but it's important to note that on top of technology we need a political apparatus that allocates the efficiency gain. I.e., a 30% increase in productivity should yield not a 30% increase in profits but maybe a 5% increase plus a 25% reduction in time spent doing labor, or a 25% increase in wages. Obviously the numbers are variable, but I thought it important to note this too.

>The billionaires aren’t blocking us.

People hoarding more wealth than BILLIONS combined aren't blocking us?

You can seriously, earnestly think, in your heart of hearts, that hoarding that much wealth for yourself doesn't deprive some other more worthy endeavor of resources??

>When electric lightbulbs came to the masses, it made some rich people even richer, but that’s not the important part.

Not THE important part, I can agree with you. But AN important part, still. It might not be sensible to accost the inventor of the light bulb for the effect of his invention, but it should always be carefully noted the above mentioned calculation of where the efficiency gain is allocated. Did allowance of working at night, doubling the possible working hours allow any reduction of labor or increase in wages? Or did it simply increase profit?

I agree the profit itself is not the problem, the billionaires themselves aren't technically the problem, merely a symptom of an ill designed allocation of the fruits of technology. But to act like 9 people hoarding more than billions of people combined isn't detrimental to any goal other than the enrichment of those 9? I name you silly, sir.

>Taxing billionaires won’t accelerate any of this.

I mean, if we JUST tax them and then burn the tax money we get from them, sure... But if we spend the taxes on the above mentioned goals... You somehow think they wouldn't be achieved quicker???


While I agree with the general sentiment, poverty and inequality are wildly different problems (e.g. modern capitalism tends to increase inequality but also decrease poverty).


Wildly different, sure, but deeply, deeply connected which is why I include them together. There is no reason we can't decrease poverty AND inequality. In fact decreasing inequality would in turn reduce poverty if done correctly.


Agreed.


I didn’t bother asking my kids what’s their opinion regarding cigarettes.


You probably should. Many of my school friends started experimenting with tobacco and marijuana in elementary school, which were the prevalent vices of my time.

Kids are autonomous humans with free will, and despite what many adults think, they have opinions worth listening to.


Why? They’re 5 years old. I dgaf what their opinion is—smoking is bad. Period.


social media is more like crack or some other terrible drug


Really? It bankrupts its users? People can't hold a job because of it? People will rob stores to get more money to use more social media? It causes families to painfully detach themselves from addicts?


Those behaviors aren't limited to substance addiction--perhaps most well known is gambling addiction.

It's no secret that technology companies (especially social media companies) spend billions coming up with ways to keep users on their apps/services. Engagement is a palatable way to measure addiction.


it does because it eats away all valuable time and turns people into slaves and consumers instead of creators of value


It's ok to admit that you're wrong and that it's not like crack or some terrible drug.


it’s ok to have a different opinion and not lecture other people how yours is better


The problem is that we are talking about kids who could be 9 or 10, not just 17 or just under 18. Yes, there may be some 12 year olds on HN, but I doubt you will get a 10 year old here with their opinion. There is a huge difference between a 10 year old and a 17 year old, even though I personally want to ban social media until 21.


I'm 20 and all for it, and have been all for it since I thought about it at 17. I think that social media and the internet are ways of not having to develop social or emotional skills and going through school without social media would be really healthy for future generations.


i think that the deliberate design of applications to profit from or incur damage to mental health should be prohibited at state and federal levels


Behind every great fortune is a great crime…

As usual, American society superficially protects children while the most harmful perpetrators are given near-absolute access.

I really think the addictive qualities of video games should garner the amount of attention these social media companies generate. Actually, anything that takes your time and gives you nothing is suspect. As always, nothing will save you from a grift except knowledge.


> Meta, along with Snap, TikTok, and Google, now face hundreds of lawsuits claiming they are to blame for adolescents and young adults suffering anxiety, depression, eating disorders, and sleeplessness as a result of their addiction to social media. The companies also face scores of complaints by public school districts on behalf of students alleging that the platforms have created a public nuisance.

Oh, I'm sure this particular lawsuit will be the one that solves these problems once and for all. Hint: it won't. This is performative politicking. Meta and co will continue to pay minuscule fines (not too much though, META is in a lot of ETFs and mutual funds), they'll continue to spread lobbying money around to both political parties, parents will point fingers (without taking any responsibility), young adults/teens will drone on about "mental health" because it's fashionable (if social media is giving you an eating disorder or making you anxious, how about putting down the phone?), and nothing will change.


> young adults/teens will drone on about "mental health" because its fashionable (if social media is giving you an eating disorder or making you anxious, how about putting down the phone?)

To be fair, for many young adults and teens the parents are demanding the kids have their phones so that the parents can track their location and other surveillance shit.

Additionally, we know for a fact that these social media companies are hiring teams of people with full doctorate degrees to keep you online and responding. How is a 13 year old's underdeveloped brain supposed to compete with that when most adults can't even do it?


For every parent demanding that, there's 100 kids asking for a device.


See my second point: how is a young teen supposed to handle something designed by teams of psychologists to keep them engaged? It's why we don't let children play slot machines or allow tobacco advertising aimed at children... they're children. They literally don't have the same judgement capacity as adults.


Things are in fact starting to change, though I wish they happened sooner. The FTC is currently looking to hire child psychologists in an effort to further investigate this kind of claim.


You don't need 'experts' to realize that social media is bad for young people. They are just going to suggest regulating content. I can see it now ... young girls cannot see content showcasing skinny or 'classically beautiful' women, because it gives them a skewed perspective of 'beauty' that may lead to eating disorders. The FTC pats themselves on the back, job well done, and nothing changes. Regulating the internet isn't going to change anything; the genie is out of the bottle. Change starts at home and at school. Put wifi and cell jamming technology in public schools. How many ads are kids shown on public school property? Kids are spending 5 hours a day on social media. That is insanity. That time needs to be replaced with productive things: reading books, playing sports, building things.


> I can see it now ... young girls cannot see content showcasing skinny or 'classically beautiful' women, because it gives them a skewed perspective of 'beauty' that may lead to eating disorders.

Do you actually have any real exposure to the kind of things that are targeted at women on social media? It's more like: young girls shouldn't get sold the idea that it's reasonable for them, with the full school day and budget of a teenager, to achieve the same appearance as a wealthy woman who has access to time, dieticians, personal trainers, and a plastic surgeon, except the wealthy woman tells you her one secret trick is actually a $75 green powder she takes in the morning (the powder-making company is paying her, of course).


yes, I've watched my wife and female friends/family scroll Instagram ... every 3rd post is an example you provided. Body image/beauty issues for women were a thing before Instagram and social media so I don't think regulating the nature of advertisements would change much, at least in any measurable way. There are infinite influencers creating similar content as the advertisements.


If you've seen your wife and female friends scroll Instagram then you know you were inaccurately describing the scenario as trying to ban "classically beautiful" women. The problem is you think it doesn't change anything, maybe because you're ignorant that Instagram knows they are measurably worsening, on a mass scale, the self esteem and body image of young girls. Instagram has the data on all these girls and knows they have a causative relationship with negative body image.


My comment was in regards to what the FTC appointed child psychologists will inevitably suggest: that particular ad content is "bad". It will change absolutely nothing.


> Put wifi and cell jamming technology in public schools.

Ah, yes, technical solutions to social problems. Those always work.

Here's a better idea: raise your kids to value things that are real, and then teach them that nothing they see on a screen is real.


And first, lead by example. If the parents cannot help looking at their phone every 5 minutes, how are the children supposed to act?


> nothing they see on a screen is real

This is just as irrational as believing anything on a screen is real.


Read some Plato. Preferably with your kids, while they're at an impressionable age.


Stop posturing. You know perfectly well that screens present true as well as false information and media literacy is a matter of distinguishing between the two rather than making categorical assumptions.

Frankly you come off a lot more like Diogenes than Plato.


I agree about the parenting piece. But social media usage spiked for youths when children were given powerful computers in their pockets they have 24/7 access to. I'd argue it's a technical problem as much as a social one. Geofencing social media in public schools seems appropriate. My comment on wifi or cell jamming was hyperbolic.


cell and wifi jammers are illegal (with some very good reasons), so yeah you're going to need some experts

we know what happens when the government creates uninformed regulations, so I think it's worth supporting them when they appear to take a reasonable approach

kids need to be able to operate within the world that technology has created, and parents need informed guidelines and tools to enforce them

the all or nothing approach probably isn't the way to go... how's abstinence only sex education going, for example?


> how's abstinence only sex education going, for example?

I'd say that access to Instagram isn't a biological function, and normalizing it will leave it ever more entrenched in our lives. Make 18 the age requirement. I don't care about privacy concerns, because you have no privacy if you participate with these companies.


I'm not as pessimistic as you. There is more and more evidence (https://jonathanhaidt.substack.com/p/13-explanations-mental-...) that social media companies have essentially engineered a harmful addictive drug - at some point they will be held accountable. I do think we are going through the tobacco company phase, where the CEO's will stand before Congress and solemnly state they think their product is not harmful, and even if it is, it is a matter of "freedom" or something.

"Just put down the phone" is this generation's Nancy Reagan "Just say no" moment. It was a joke amongst my young peers back in the day. But the real war on drugs, the War on Tobacco, has been incredibly effective - smoking used to be ubiquitous in the US, now it is just about gone. Social media, with attention-engineered timelines, is tobacco.


When will social media finally be regulated like alcohol, nicotine, and every other addictive poison?


Many addictive things are not significantly regulated, e.g. sugar, video games, coffee, reading HN, sex, TV, etc.

It's not just a question of addictiveness but also of potential level of harm.


When the First Amendment is repealed.


That may be so, but then historians will write about reckless corporate behavior as the harbinger of that repeal. If you keep pushing the boundaries of what's technically allowed and it has bad ends, you'll eventually see the rules changed to stop you.

You and I probably both agree that that a substantial revision to the First Amendment would be terrible, and so what we probably want is for these corporations to act in voluntary consideration of what's socially responsible so that it's not necessary.


Censorship and advertising regulations both exist in accordance with the First Amendment.


Yes: censorship is prohibited, at least outside of specific narrow circumstances. Advertising regulations specifically regulate advertising products. Neither of these translate into regulating what can and cannot be said on social media, outside of narrow cases like promotional or sponsored content.


Who said this had to do with censorship of content? The issue with social media is algorithms have been fine tuned to incite FOMO, narcissism, self-esteem issues, etc., and keep people addicted/endlessly scrolling.


The above commenter mentioned censorship. Regardless, the usage of algorithms doesn't mean there's a different set of rules the government can use to regulate it. The incentives you mentioned aren't limited to social media. Traditional media also has these same incentives, and tunes their headlines to be more sensational and attention grabbing. For example: https://blog.tjcx.me/p/new-york-times-ab-testing HN discussion at the time https://news.ycombinator.com/item?id=26419070


No idea what you're saying. The NYT A/B testing headlines is not something that impacts teen mental health in the same way that social media impacts mental health.


Does it? Has there been any research done on the effect of NYT content? This comment reads a lot more like assumptions and stereotypes.


We're talking about two different things. You're focused on the legality of this lawsuit. The only thing I'm pointing out is that it isn't very difficult to believe social media has a negative impact on mental health in its current form and is likely the source of teen self-esteem issues, body image issues, and inability to concentrate.


"It isn't very difficult to believe..." is an incredibly weak statement. People believe all sorts of demonstrably false things. I expect a much more robust argument than assumptions and speculation. The actual study [1] that kicked off this moral panic in fact found that Instagram was more likely to make people feel better than worse, despite most media outlets claiming the opposite.

1. https://about.fb.com/wp-content/uploads/2021/09/Instagram-Te...


Why are you unironically using data from FB themselves?

I wasn't presenting myself as a subject matter expert. I'm speaking as an individual who has lived through the evolution of social media and can see how and why Instagram and TikTok are as addictive as they are.

There absolutely needs to be more research done in this area. Hoping this lawsuit will shine more light on the connection between mental health, concentration issues, and social media.


> Why are you unironically using data from FB themselves?

This was the study that triggered all the "Facebook knew Instagram was harming teens' mental health" stories. It was also the study that formed the basis of the whistleblower's argument.


Practically all social media revenue is derived from advertising, and the lawsuit specifically references advertising regulations. The regulation of advertising would seem to necessarily imply the regulation of any ad-funded social media.


> The regulation of advertising would seem to necessarily imply the regulation of any ad-funded social media.

No, it would imply the regulation of advertisements that are displayed on social media not the actual content or algorithms of social media. Advertisements are regulated, the rest of the content people put on social media is not (outside of very narrow things like credible, true, threats).

Content doesn't suddenly become saddled with the same regulations as advertisements themselves just because it's hosted on an ad-supported platform.


No, they exist as instances of rationalized disobedience. If the founders had meant something other than "Congress shall make no law," nothing stopped them from saying so.


What about in the rest of the world where the first amendment doesn't apply?


These are US-based companies (and there's a reason for that.) If they voluntarily choose to subject themselves to the laws of other countries without constitutional free-speech protections, that's on them.


They are all in the EU, so there is an example of markets with a different view on free speech from the US.


Yes, it is on them, and they do so subject themselves to our laws because markets like the European market are massive.


Section 230 more like it


Section 230 doesn't stop regulation of social media by the government. Do you know what it actually says?


It says, basically, that these internet companies are not publishers, so they aren't liable for things that get printed on their platforms. Libel law was traditionally how publishers were "regulated" by the government and the civil courts.


Irrespective of the incidental value "Meta" provides, let's keep in mind that no business has a right to exist independently from its standing in the system of laws and the health of its users - especially children.

Social media companies argue that they can't monitor every post and do appropriate age verification online because it would be "impossible to scale". Well, tough luck. Not being able to scale is YOUR problem, not MINE.

We routinely prevent certain business from existing, like a factory that wants to dump harmful chemical on water people use for drinking. If the factory investors say that there is no alternative, we will still prevent, to the best of our ability, the construction of the factory because it's a business that simply shouldn't exist in the proposed manner.


As a parent of a now-13 year old, I’m disappointed that COPPA hasn’t been updated after companies have gone aggressively after children.

I was shocked over the summer when I received an email saying I was going to lose some of the Xbox controls we have in place, as if 13 was some milestone of maturity and self-control. AFAICT, I’m legally responsible for my children until they’re 18, and at least at the moment, they are completely dependent on their parents for food, shelter, and transportation. Just because they’re online, I don’t abdicate my responsibility as their guardian, but companies think it’s a good idea to allow kids to bypass the restrictions that were in place because they’ve reached an age that was meant as a minimal age for tracking in a very young Internet.

Unfortunately for our kids, we've become extremely restrictive about the services they have access to: they can only use platforms that allow communicating with people we know, etc. Is that the right move? For us, right now, it seems to be.


> As a parent of a now-13 year old, I’m disappointed that COPPA hasn’t been updated after companies have gone aggressively after children.

COPPA was the culmination of a series of progressively narrower attempts to restrict internet content and practices using "think of the children" as a justification. The previous efforts were, but for some standalone bits (like Section 230, which survived from the mostly-unconstitutional Communications Decency Act), struck down as unconstitutional, which is why there haven't been attempts to update it with more broadly applicable restrictions: it's already the outcome of a process of backing off to the maximal constitutional restriction. (OTOH, with a socially conservative Supreme Court that is unusually unconcerned with respecting precedent, I suppose now would be the time, if you wanted to try.)


There is historical precedent for limiting access for children. Children can't legally buy cigarettes, alcohol, lottery tickets, or firearms (ownership of which is a constitutional right), or enter certain prohibited places. Unfortunately, many service providers have abused their access to children and acted as if children can legally enter into some type of one-sided quasi-contract. The bar for working with children at any physical location is huge, and the boundaries are well understood.

I know of exactly zero places I take my children (activities, sports, clubs, etc.) that would speak to them privately to coerce them to use that service more without parental consent. That's what Meta and a million other online services are doing to children.


"Not being able to scale is YOUR problem, not MINE."

Indeed, it is doubtful that these mega-sized websites were intended by the original architects of the www. More likely, they may have imagined the web as a whole would grow (scale) as more people gained internet access. Instead the "Silicon Six" intend only to scale their own gargantuan websites that they now refer to as "platforms". All web publishing and even internet communications are expected to be intermediated by a handful of surveillance advertising companies. What a disaster.

https://www.washingtonpost.com/outlook/2019/11/25/silicon-si...


>Indeed, it is doubtful that these mega-sized websites were intended by the original architects of the www. More likely, they may have imagined the web as a whole would grow (scale) as more people gained internet access.

This, I think, is roughly true but the issue is nuanced. The history of the web can be characterized by ever-decreasing barriers to entry. At first you needed a physical host and apache. Then, you could have a shared host and put html files in your user directory. Then, you could run wordpress. Then, you could sign up for myspace and never have to run anything. The 'like' is the smallest unit of content that can be captured by such systems. TikTok, Facebook, Instagram, wikipedia, etc are what you get from this evolution.

The unexpected downside is that with reduced friction to content-creation, the content has become quite awful. In part because people are awful, but also because of money and how effective it is to hire people to say things using these systems. If the friction to create content remained high, it may have been a "better" internet in some measures. It would certainly be a much smaller one.

Who knows how it will turn out? I suspect that AI will have a significant impact on all these systems, to wit, injecting massive amounts of noise that will reduce their utility significantly, if not to zero. No empire lasts forever.


Regardless of what the original architects of the web may have thought, it was clearly not designed for (a) only a small number of mega-sized websites with hundreds of millions of pages that only serve resources produced by others, as a substitute for (b) many websites each hosting resources their operators have produced, or at least public information.

I find the history interesting. But I started using the www in 1993. There were no so-called "tech" companies like Facebook or Google. Younger web users born into a web intermediated and surveilled by so-called "tech" companies for the purpose of advertising may have no such interest.


> it is doubtful that these mega-sized websites were intended by the original architects of the www

Not to jump to the defense of social media companies, because I honestly think they're not worth the cost of keeping around, but who gives a shit what the original architects of the www intended?

They weren't forming a country and making a constitution. Hell, even for constitutions I don't really care what the original founders wanted, and for the most part most of them thought you shouldn't care either, as they meant for them to be amended.


Not saying I think it's a good outcome but I don't see why it's relevant what the "original architects" thought. This is just a variant of 'argument from authority'.


The mistake we are making here is to draw the problem along the boundary of age.

The problem is not exclusive to children!

Even if you could successfully bar all children from participating on Facebook, the problem would still exist, and Facebook would continue to harm people.


There are many things in life more harmful than Facebook that we don't ban. Just because you don't like something or it has some negatives doesn't mean it should be illegal. Indeed given that it's impossible to specifically define "social media" such a ban wouldn't even really be legally feasible given 1st amendment issues.

Now personally I don't use alcohol, don't do drugs (including caffeine), don't gamble, and barely ever use social media (except HN, my one guilty pleasure). But I still think those things should all be legal.


Very much true. Marketing content should be carefully vetted and filtered. The people you wish to project that content at may want to take back control from addictions they suffer from (e.g. food).


Presumably adults are supposed to be old enough to know better than to use Facebook.


> Social media companies argue that they can't monitor every post and do appropriate age verification online because it would be "impossible to scale". Well, tough luck. Not being able to scale is YOUR problem, not MINE.

1000%. I think there are more complex problems in the world that have been solved than this. If you look at the activities of youth and teenagers on Instagram, their photos and videos, and their social graph, I don't think it is so difficult to detect them with high accuracy.


I think monitoring every post is a pretty solved problem, and it only gets better (worse?). Censoring has varying degrees of success, whether that is outright deleting a post/user or just hiding it from "not interested" users.


> a factory that wants to dump harmful chemical on water people use for drinking

Your example is describing an externality, a cost imposed on an unwilling third party. Facebook is a platform that individuals can choose to avoid or restrict their children from using if they deem it harmful. These situations are not comparable.

> no business has a right to exist independently from its standing in the system of laws and the health of its users - especially children.

I disagree. Businesses and websites ought to have a right to exist as long as they do not inflict harm on third parties and their users participate willingly. Individuals ought to be free to make decisions regarding their own health and that of their children. If we were to consider otherwise, can you fathom the sheer volume of things we'd need to prohibit?


> Your example is describing an externality, a cost imposed on a unwilling third party. Facebook is a platform that individuals can choose to avoid or restrict their children from using if they deem it harmful. These situations are not comparable.

Yeah just like alcohol and drugs, you can easily restrict your children from using them so companies should be free to sell to whomever. Good parents ensure their children aren’t doing this.


Talking from personal experience growing up, I would generally agree. The prohibition from selling alcohol/cigarettes to minors was pretty ineffective. The only effective prevention was having stricter parents and facing consequences. That said, I wouldn't say Facebook is at all comparable to alcohol or drugs...


The prohibition on selling alcohol and drugs has clearly had no effect on their availability. So I'm not sure what you are trying to argue.


But what about all the small, independent forums on the internet? For them it would really be impossible to comply with monitoring every post. Often they are hobby projects.


That's obviously not a primary concern. Similar to how your home kitchen isn't going to have a food inspector show up and close it.


The comparison you're making is nonsensical.

My home kitchen doesn't routinely serve strangers. The forum I host through a server in my cupboard does.

If you want to make a comparison, it's between a small family restaurant and a giant fast food chain. It may surprise you but food inspectors treat both in roughly the same way.

Face it. Nothing meaningfully separates my forum from giant social media companies apart from scale.

Hence, small forums hosted by hobbyists and volunteers should be shut down.


> My home kitchen doesn't routinely serve strangers.

Most states have cottage food laws, with basic regulations for small-scale commercial production. E.g. you need to take a course on food safety, you need your water tested, and you can only sell up to $X per year, but you aren't subject to random inspections. There is no reason why the web shouldn't be regulated similarly, with different rules based on the size of the audience.


How does that mesh with the original post's point?

If we have different rules depending on scale, how is it OK for the rules for large-scale social media platforms to be impossible for those platforms to comply with?


Ideally, no matter what the size of the platform is, if it's genuinely impossible for the people running that platform to not harm children, that should be their problem. If they can't figure out how to not be harmful to children, they don't get to exist. That's how it's supposed to work.

In practice though, it's not unreasonable to say that we should expect more from companies that can cause harms at a much larger scale. Perhaps there's some kind of threshold where we can accept some harm being done at a small scale while still not allowing harm at a much larger scale. Seems fair enough.


Anything at scale is something else entirely.


You're making a ridiculous argument


What? How do you figure? I run a small hobby forum for a niche interest and it's very easy to monitor every post. We have five moderators, a couple hundred registered users and we get maybe 3 or 4 posts a day.


There needs to be a line drawn between those who serve a large number of people and those who don't. Either way, only those who are big enough can be actively monitored and held accountable.

Before the obvious argument of "where do you draw the line?", I may add that only a select and obvious few social platforms reach Meta's scale.


The original post claimed every post should be monitored for "harm to children". And that if giant social media companies can't do that, they shouldn't exist.

Now you want to exempt everyone apart from giant social media companies from that same rule?

This is getting a bit Kafkaesque.


Oh, I'm sorry. I didn't notice that extremely important part. That is... more than Kafkaesque. I thought the conversation was around marketing, not user posts.


I frequent the OCAU forum here in Australia.

Moderating the News board became too arduous and carried potential legal liability.

So they turned the board off.

It’s really not that hard.

And anyway, just because you can’t fix everything, doesn’t mean we shouldn’t try and fix something.

All or nothing thinking leads to inaction.


So we can only have discussion forums where the owner is a big company that can afford to have a team of moderators around the clock. Yes, that's going to end well. (For the elites, that is.)


My direct experience is that all social is now Dark Social.

It's invite only Groups on Telegram, FB Messenger, or Apple Messages mostly.

So it has ended well for my social circle ;-)


A lot of discussion forums died that way. They shut down the problematic parts, but those were also the parts people were engaged with.


To paraphrase the GP - Well, tough luck. [This] is YOUR problem, not MINE.

If hobbyists can't comply, maybe it's about time we prevented small independent forums from existing.


> Social media companies argue that they can't monitor every post and do appropriate age verification online because it would be "impossible to scale". Well, tough luck. Not being able to scale is YOUR problem, not MINE.

Nope. It's also your problem, because it has to be enforced somehow. Few websites are age appropriate for young children (this one certainly isn't), so perfect enforcement would mean essentially no viewing websites without login, and all logins requiring display of face and government issued ID to live webcam.

The ID check is already in place for several countries with YouTube if you try to view an age restricted video.


> no business has a right to exist independently from its standing in the system of laws and the health of its users

I don't understand what that means, and how I can know if my business is inside or outside the law.


> I don't understand what that means, and how I can know if my business in inside or outside the law.

Simply put this sentiment on the landing page of your website, on your product's labeling or, if you have a mailing list, in the subject field of your next newsletter. "I do not understand if my business is inside or outside the law" is bound to generate useful feedback.


It's unclear if even the porn site verification laws will be allowed[1] so I'm not sure the chances are that good in terms of requiring age verification for all social media.

https://www.theverge.com/2023/8/31/23854369/texas-porn-age-v...


Look here https://news.ycombinator.com/item?id=38001630 for talk about regulating this new-ish aspect of business.


Glad to see this take here.

So often I've seen the opinion that effectively expresses that some company or business plan has a God-given right to be viable. Not directly of course, but through the about-face when someone points out that the company's business proposition conflicts with something—but how will the business make a profit if it has to [submit to government regulation] because of [concerns about marketing to children]?

Some people could probably steadfastly support a new business that was selling something dangerous but that was new and “innovative”. “Oh well people just have take responsibility for their own well-being—regulating crystal dust fumes™ would be way too onerous and impractical.”


> Not being able to scale is YOUR problem, not MINE.

Following your reasoning, anything that could possibly harm children should be banned. Cars sometimes kill children, shall we ban them too?

Now, suppose we do ban Meta. Where should we stop? should we ban open decentralized social media too?

You may not like social media, but a lot of people use them without any particular issues.


I think the lawsuit in the article draws a reasonable, if not conservative, line at knowingly exploiting vulnerable people to boost profits, which a whistleblower is saying Meta is doing. It's not just that the product is harming people, but that it is going out of its way to harm people. Is open decentralized social media doing this?

There are restrictions around advertising for cigarettes, I think for the same reason. To take it a step further in the discussion around cars: I think companies who sell certain sized trucks are aware of the blind spots that make it hard to see people in certain places. Reasonably, these trucks should be banned, or at least, more conservatively, no new trucks with this flaw should be built or sold, or they should require a commercial license.


> knowingly
> exploitable
> vulnerable

None of these are cut and dried. How vulnerable does someone have to be before you can exploit them? How harmful does the behavior have to be before it's exploitation? What is "knowingly" in an organization? How should events be handled if it's a single engineer operating in the shadows vs a board decision? etc etc etc

The answers to these questions may seem obvious to you, and someone else may have a completely different answer that seems equally obvious to them.


It's not about any of our opinions. As the article says, there is a whistleblower from Meta saying that this is documented. The lawsuit is what's going to sort it out.


The internal study presented by the whistleblower actually found that Instagram made users feel better more often than it made them feel worse: https://about.fb.com/wp-content/uploads/2021/09/Instagram-Te...


That just means it’s about the opinion of the government, which isn’t much better in my opinion (and is arguably worse).


No, it means it's the opinion of the jury. Their job is to determine fact.


> it is going out of its way to harm people

I'm having a very hard time buying this. I may be wrong but I'd assume that Meta is audited and well-regulated. Besides, the company is very transparent. I know a bunch of people working there and they don't strike me as people who would purposely harm people. Besides, every employee has access to the whole code base and freely talk about what they work on. I don't believe in some conspiracy.

That being said, it's perfectly reasonable that we keep these companies under the microscope given the power they have, and they may be making mistakes.

> Is open decentralized social media doing this?

Arguably, decentralized social media without moderation would be more harmful than Facebook/Instagram.


> Cars sometimes kill children, shall we ban them too?

Yeah, if they sold cars with a zero-star safety rating while claiming that it's just too hard to do anything about making cars safer when you sell cars at scale.

If Meta was a low-margin business, we might buy their argument that "it's just too hard".


We also don’t allow children to drive cars, so the analogy is DOA


That's not following their reasoning at all, it's a strawman. His point was that a business shouldn't be able to claim that doing something we want/require them to do is too hard and just not do it, which is what they currently do with content moderation.


How do you know they're not doing it, and that it's unreasonable to ask more considering the difficulty of the task? His point is that there's no excuse for not doing it, even if the task is impossible. We don't ask the impossible of car manufacturers, why should it be different with Meta? Who's to decide?


Because it's well documented that they're not doing it. They have had several whistleblowers around content moderation already. The whole of Africa's content moderation was basically being handled by a mid-level employee.


It is for the society in which these companies operate and profit to decide, not the companies themselves. And if it is indeed “too difficult”, which we all know just means it will hurt profits to a degree that is unacceptable to shareholders, too bad, they can either make less money or shut it down.

If a cop can kill a man in broad daylight for selling loose cigarettes, we should be able to kill a company like meta for breaking the law.


We do ban cars. We have safety ratings and regulations that they have to meet, otherwise they are banned from sales. We also have driving tests and age requirements for using them.

Since we still have cars post regulation, maybe we would still have social media as well.


Using a car without a license is in fact banned, because this minimizes harm to kids (and non-kids).


> Using a car without a license is in fact banned

On public roads, but not otherwise.


The poster didn't say ban, this is a huge strawman.

The poster said laying down requirements and them being unable to scale whilst adhering to those requirements is not an argument for removing said requirements.

Our priority is the safety and health of society, if the only way they can scale is to put people at risk then maybe as a society we don't WANT them to scale.


We actually do restrict children from driving cars. It’s pretty common in most countries


> Now, suppose we do ban Meta. Where should we stop?

Google.


Meta is not being banned, though.


Cars aren't a great example, they're a huge problem and we'd be better off if we weren't all dependent on having a car


PLEASE PLEASE PLEASE explain how that's in any remote way equivalent??

"hurr durr lets ban cars now". PLEASE.

Unlike social media, to drive a car you need a license, you need insurance, and if you run over a kid you most likely will go to prison. Or maybe you weren't driving your car, but it was unsecured, or had a mechanical malfunction. You are still responsible and you will in one way or another have to pay.

But, you can create ads on social media targeting minors with toxic messaging, and nobody bats a damn eye.

It's about time that online commercial endeavors, like ad networks, get reined in and have rules applied to them, same as for other activities.


> PLEASE PLEASE PLEASE explain how that's in any remote way equivalent??

let's ban meta because they can't prevent that some kid somewhere is gonna be able to create an account (and may see really horrible things there)

let's ban cars because some kids are going to get run over because car manufacturers can't prevent it.

You're welcome


I mean FB engineers are specifically optimizing for toxicity at an algorithmic level. It's the product. You can't perform controlled experiments to measure the profitability of depression and also feign ignorance.

You’re all very welcome to whatabout but working at FB is a stain on your character.


I mean no, they're specifically optimizing for engagement. It just turns out that toxicity fuels engagement.


Among Haugen’s allegations was a claim that the company was knowingly preying on vulnerable young people to boost profits.

"Knowingly preying on vulnerable young people to boost profits" is different from blindly optimizing for engagement without understanding what works.


Oh they understood how it works, and they definitely accepted that their efforts to boost engagement caused mental health issues, especially among teenage girls. But toxicity was not a goal in itself.


True, although that's just an "allegation".


Non sequitur. They're optimizing for money. Contrarian status: weak.


Isn't that a bit like tobacco companies "optimizing for engagement", but knowing that the same ingredients cause cancer?


In all seriousness, no, and for obvious reasons. The toxicity is created by the participants, freely and of their own volition. Is it Facebook's responsibility to police that, shut it down, censor it? Honestly... And where, who, and when? According to what laws in what regions? Also, how do we know Facebook created it, or if it is simply making visible the toxicity that everyone knows is middle school and high school.

I honestly don't know, but I don't like such disingenuously simple shoot-the-messenger answers. That being said, if Facebook amplified it willfully, already knowing the damage -- then it's a different issue. And I think there is some evidence of that.

Either way it's different than tobacco, which was created by the company itself.


> Either way its different than tobacco which was created by the company itself.

One could argue that Meta willingly boosted toxic engagement themselves.

This didn't happen in a vacuum. When a controversial post or comment suddenly pops up in your feed, out of nowhere, participants may be responsible for the responses to it, but Meta is very clearly seeking, targeting, and amplifying that toxic behaviour for profit.


Again, you don't know that. That can be the result of simply showing what is engaged with more, and you end up with a rage-loop that was in no way designed purposefully into the original system.

You're on hacker news you should know these types of system design issues...


> That can be the result of, show what is engaged with more, and you end up with a rage-loop, that was in no way designed purposefully into the original system.

That may have been true fifteen years ago. There is now a decade's worth of scientific literature on social network effects on human behaviour, and I think we can agree that this is hardly something no Meta executive knows anything about.

> You're on hacker news you should know these types of system design issues...

What I'm suggesting is that this is not an issue for them, but a feature.


It is creepy how much social media wants to monitor kids.


It's like the nicotine business. It turns out all businesses that dabble in addiction end up wanting to monitor and control kids from a young age.


I wonder if ads shown to kids "make more revenue" over the lifecycle of the kid than ads aimed at adults. Like the YouTube videos for kids that are nonsense but rake in millions of views.


I get asked weekly for an ad-bait app that my kids find through an ad-bait app that they previously had approved. Fun fact: once you've approved an app on Apple's App Store, you can't unapprove it. Hit the wrong button, too bad. Game switched from purchase to free-to-play and full of ads. Too bad.

The only option is to uninstall the app and then block App Store altogether, at which point they can’t update anything.


We market IAP almost exclusively to kids. They buy em up like candy, and by the time parents realize it we’re gone.


Who is "we" and how do you sleep at night?


Profile checks out

> Ethically challenged software engineer working for a major tech company you know

Though scanning the account's comment history, they appear to use this site almost exclusively to brag about their unethical and occasionally criminal behavior. Not "challenged," but "bankrupt." Downvotes encourage them to continue this trail of evidence, apparently.


in their bio:

> Help me get to 1000 karma and I’ll stop posting forever.

Maybe we should give them that last 50 karma.


You really believe their promise to leave, and that they won't, say, bump the threshold to 1500 and keep posting?


fair point. couldn't hurt to try?


Optimism, defined. Alternative strategy: downvote, flag where appropriate, and they'll eventually be shadowbanned.


I can't downvote, nor flag. I'll leave you to your work.


I will really stop posting at 1000.


Saying "really" doesn't make you any more credible.


You have nothing to lose.


Not a selling point.


Just 25 more karma and I can finally stop posting.


I guess you do have something to lose.


what's so special about 1000 karma? I never comment on things for points.


Not the original person but from talking to parents, a lot of the games are like $7/week subscriptions.


Get them addicted to social media feed scrolling early.


At least they are "thinking of the children".


Now do it for the food and candy companies


It seems almost certain that candy (and other high-sugar foods) is far worse for kids than social media.


"In our A|B testing, we're able to show that making a small portion of teens suicidal increases app engagement by 0.05%. We were unable to calculate the dollars of profit per teen life lost, because our research teams keep quitting."


To the product managers, strategists, and designers at Meta: you know, you could just decide NOT to create harmful, exploitative applications.


Can they? Or is the science so vague and under researched that they don't actually have a way to know if the next feature harms or helps?


Preface: I'm basically on your side on this.

But: I know some PMs and designers at various social media companies. There are also probably many among us here on HN. The thing is, I don't think most of them think to themselves "I am going to build a harmful, exploitative application". I think most of them, like many of us, wake up in the morning and aim to do a good job. To do good work. To design and build to the best of their abilities. AND - importantly - to excel in their careers, become more senior, and make more money.

The problem is their incentives. They are paid, promoted, and bonused to maximize things like "engagement". The more people use their (likely well-intentioned) feature or design, the more they're praised by their colleagues, the more they're promoted, the more they're paid. So they keep doing things to get more people to click and scroll more and get more praise and money for it. They make a button easier to understand - a post easier to share - a picture easier to like. They fill the feeds with more things that more people click and share and like. And they make more money and get more senior and the cycle continues ad infinitum.

I think SOME likely step back and consider the ramifications and maybe look at what they've built and see depression and rage and anger and misery, but those are the ones who aren't promoted and maybe leave the companies because that kind of observation doesn't make number go up. And number must go up to pay for that $5000/month apartment in SF or the $30k/year private school.

And others likely consider this but justify it by all the not-horrible things they see too -- the people who are happy, the influencers who make a career out of using their tools - the "creators" who power the "creator economy". And their numbers go up too, so they're happy, and they tell the PMs and designers and engineers what a good job they're doing, and so on and so forth.

So the problem is this cycle will continue as long as it's rewarded as richly as it is. And frankly, the free market WILL continue to reward it as long as number keeps going up (and that number is the stock price). Which is why a lawsuit by someone NOT part of that free-market cycle (or part of a different part of it) might actually make a small dent in this horrible infinite cycle.


I was at a talk awhile back where a designer for a large gambling corporation was asked in the Q&A if they had any ethical concerns working for a company that preys on the vulnerable.

The designer's answer was…… interesting. They basically said (and I'm paraphrasing) that their aim every morning was to go to work to ensure that users were not being mistreated by dark patterns and unclear design. Their aim was to design the best experience for their users, to ensure that their users could find an easy way to stop gambling if they wanted to.

I was on the fence about their stance. However they truly believed that they would never intentionally do harm to users, they always wanted the best for their users.


They (and by they I mean everyone - not just the PMs, strategists and designers) either flat out don't care or have deluded themselves into thinking they can still make a positive difference. I'm not sure which is worse.


[flagged]


Misogyny isn’t cool, man.


Removed the w- word.


> Company has said it has improved resources to keep kids safe

This is the logic of psychopaths.


Classic orphan crushing machine


Exactly. And it’s perfect that it gets downvoted to oblivion.

It's like Hacker News poetry.

Maybe they should stop putting kids at risk in the first place.

Those downvotes are badges of honor on this one. Shameful.


What’s stopping parents from keeping their kids off Meta’s products?


The role of parental responsibility is a valid but separate matter.

Kids can sign up to social media from a friend's phone/ computer/ tablet, from a library computer etc., and self-certify their claimed age. They can get phones (without cellphone contract) from other people without their parents' knowledge.


It seems to me like Meta’s marketing harms everybody who uses their products… it’s not specifically kids.


ease of access.


Aside from the measurable harmful effects, these companies have simply accrued too much political power.

Being able to manipulate what 100s of millions or billions of people see every day on their screens is an unprecedented amount of power.

Democratic nations are not going to tolerate these companies much longer.


Not that I like social media or tech companies (I detest them). But it's not a special case. Media companies, cable companies, newspapers, etc. They've all enjoyed the ability to manipulate and 'guide' what people see and what opinions they form, for a long time. It's nothing new.


Really, it's extremely new. The feedback loop for old media was very coarse. They knew how many newspapers they sold and had an idea of who the repeat customers were. Editorial could be said to be linked to bumps in opinion polls, but only very loosely. Nobody knew how or if anything worked. It was all theories and snake oil and "conventional wisdom". The old adage was that 50% of marketing was effective but nobody knew which 50%.

For old media to have the same qualities as new media, newspaper publishers would have to know in real time if you were a new or returning reader, how long you spent reading page 32, which article you talked about with your friends, who your friends were, what their and your demographic profile was and then be able to serve up customised content within an instant based on all of those factors. And then say that they aren't responsible for any of it because they're just a platform.

Really, the only thing that old and new media have in common is that they are broadcasting words and images to the general public. The comparison ends there.


Yeah, what's the worst that old media could do? Print uncritical, false claims about weapons of mass destruction in Iraq that led to hundreds of thousands of deaths.


It's worth noting that these lawsuits are based on junk and nonexistent science, so just because they're being sued doesn't mean there's any merit to it. It only means that law firms, ambulance chasers, and grandstanding politicians are using some misplaced anger against Facebook et al. to make a buck or raise their political profiles.


These lawsuits are from the actual states, brought by their Attorneys General in federal court.

https://oag.ca.gov/news/press-releases/attorney-general-bont...


AGs are elected politicians; they campaign on this kind of stuff. So them being behind this particular suit doesn't legitimize it. If anything, the opposite.


If people voted for them and they do this then that is the system working as it should be. Society isn't run by an elite group of the learned.


They don't bring lawsuits that they can't win. I can't name one case they lost.


Are you serious? AGs lose cases all the time (though they often get settled out of court). Examples are a trivial Google search away.


Settling is losing for the company. The company has to pay millions or even tens of millions in fines and be subjected to strong injunctive actions. Since I suck at Google, find me a company that won a court case against a state in federal court.


Let's assume that the lawsuits are poorly constructed. That does not necessarily mean that Facebook is harmless. It just means these lawsuits aren't sufficient to prove harm.

Consider this analogy: just because the DA doesn't have enough evidence to convict in a murder trial, that doesn't mean the defendant is innocent. It just means they haven't been proven guilty.


We tend to presume innocence until proven otherwise.


Sure. That’s why they’re suing. To prove it in court.


Presuming innocence doesn't make him actually, literally innocent. The reality of what happened is what it is, regardless of whether anybody can prove it to a legal standard.

Also, when you say "We tend to presume innocence", that's not really true. It's true (or is supposed to be true) for the government and people who want to be civically responsible, but a whole lot of the general population does not actually think this way. People think OJ did it, have various theories about who killed JFK, etc. People read the news about somebody accused of murder and think "yeah, that guy probably did it."


In your original comment, you suggest "misplaced anger against Facebook". Many people hold the belief that Facebook's products are harmful to children.

Following from that, I can imagine a few viewpoints: 0) Parents are wrong, Facebook is awesome! 1) Facebook has violated existing laws (these lawsuits are exploring that space). 2) Parents should be frustrated with legislators for failing to regulate social media. 3) Parents should be angry with themselves for allowing their children to use Facebook's products.

I'm curious which (if any) of these viewpoints you hold.


Unfortunately we also agreed to have the DMCA.


Not in the court of public opinion. People are allowed to judge guilt/innocence for themselves well in advance of being legally proven. Cynicism about modern day capitalism is thoroughly justified. The track record of ad-tech companies to "do the right thing" is really bad.

It's up to Meta to handle PR on such things. Unfortunately when your company is accused of, for instance, contributing to a genocide, your PR department probably doesn't have a lot of options.


Whether courts can pin that sentiment down against prior laws may determine if these "lawsuits have merit" as you say, but the concern and calls for action exist and are growing regardless of how the lawsuits happen to settle.

All these companies know they're a target for accountability now, and even many people inside of them acknowledge a need for reform and possibly restitution. Your comment seems focused on the practical matters of these lawsuits (many of which probably are frivolous or will otherwise be lost), but what really matters are the unsettled societal costs that transcend them.

You may feel that accountability is being "misplaced", but it's much bigger than ambulance chasers and grandstanders. People are genuinely concerned about real problems and are hungry for some kind of action on them.


The abdication of parental responsibility is another element here.


In other conversational contexts on this forum, people laugh and gloat about how ineffectual parental control over internet consumption is. Kids find a way, any attempt to lock things down will be quickly circumvented as kids turn computer hacker to access the unfiltered internet. But as soon as the conversation turns to regulating tech businesses, we get people like you pretending that parents have all the responsibility and can easily ban their kids from social media if they simply bothered to try and that failure to do so is nothing more than the product of parental neglect.

The same plays out with "just talk to your kids". In conversations about regulating businesses, the narrative is that parents can simply talk to their kids and persuade them to not use social media. But in other conversations, the narrative is that kids are naturally rebellious and have a strong "always do the opposite of what my parents suggest" instinct.


There’s only so much a parent can do when schools hand out Chromebooks then don’t enforce any proper-use policy. My kid watched anime all day, every day, and there was nothing I could do about it. The school just took a Wonka-esque “No. Stop. Don’t.” approach.


Social media is harmless, so it's the parents' fault for not restricting their child's use of social media? You can't have this one both ways. If there is a parental responsibility, it is incurred because social media can be harmful, which creates a cause for action.


> these lawsuits are based on junk and non existent science

Unfortunately the studies are reasonably solid. They aren't double-blind controlled studies, only correlational ones, so not the strongest form of evidence. Still, social media use seems to be heavily correlated with depression and other markers of mental illness and struggle.

That isn't very surprising: social media networks are designed for "engagement", and that comes from strong negative emotions. If you constantly bombard someone with a drip feed of negative emotions, that probably does them harm.

This doesn't mean the litigants will prevail because correlation and causation aren't 1:1 but it's probably a good idea to push your kid away from social media by explaining this to them.


Do you honestly believe social media has no impact on teen mental health, whether it be related to self esteem or inability to concentrate? It doesn't seem like a stretch to me at all.


I'm not saying it doesn't, but people argued the same thing w.r.t. violence and video games 20 years ago and that turned out to be a nothingburger. But yet on first glance how can shooting 1000s of people virtually not encourage you to do the same in real life? So we should be skeptical of claims made from "gut feelings".


I'm all in for suing the shit out of faceplant, goggle and twerk tick...

But I have to say, they're basically just exploiting an existing weakness.

The modern youth mental health "crisis" is fundamentally caused by a willingness to believe that an Instagram avatar is "real life".

Being lost in an electronically hosted make-believe land is really the root cause of the giant increase in "mental health" issues.

Why do people do it? Simple: monkey brain...

That is, especially for adolescents, people want to do what's "popular".

This is why people who give up on popularity are less affected.

But of course, every manic millennial on Ritalin is going to flame me now. Because OBVIOUSLY thinking that your player in Fortnite is the "real" you couldn't possibly lead to psychosis...


> The modern youth mental health "crisis" is fundamentally caused by a willingness to believe that an Instagram avatar is "real life".

I'm not familiar with this argument, as I've never heard it. Could you provide some examples or a study or similar showing this is widespread?


I think GP is referencing the substitution effect in media [1], albeit incompletely; it's hard to discuss the ways in which social media may be "purchased" with attention in substitution to real life interaction without having a broader conversation about conventional media. I also think it's not so easy to discuss what a true control is -- how often are people aware of the fact they are consuming a fantasy, and at what point does the line between reality and fantasy blur? Are we really so sure that the youth cannot distinguish between someone's online persona and real life persona and which parts are scripted?

[1] https://www.sciencedirect.com/science/article/abs/pii/B97804...


Humans are weak, that's one of the reasons we have laws and torts.


Are you ok? I'm genuinely asking.



