
The FTC is looking to hire child psychologists to evaluate the effects of these platforms, so expect more of this to come

https://www.cnbc.com/2023/10/23/ftc-plans-to-hire-child-psyc...

There's a thin line to be walked between privacy and regulation, but Facebook's own research has pointed at the harmful effects of Instagram on the mental health of children... so I think some scrutiny here is overdue.

https://www.theverge.com/2021/10/6/22712927/facebook-instagr...



> Facebook's own research has pointed at the harmful effects of Instagram on the mental health of children

Facebook’s own research has found both positive and negative effects, depending on the child’s status among their peers.

The Verge is quick to point out that:

> Instagram makes teen girls feel worse about their bodies

but more discreet in reminding people that fashion magazines, including The Cut (also published by Vox), have the same negative effects and none of the positive ones, like helping gender minorities find a safe space or re-socialising people with physical handicaps or anxiety. The critique makes sense (I’m not denying it) but it doesn’t come from “the algorithm.” It comes from the large amounts of money poured into the Instagram creation ecosystem. Instagram has very little control over that, as most of it flows through brand deals or magazines raising their own profile. That behemoth of money and attention comes from an industry with decades of well-documented problematic practices behind it, including the unrepentant glorification of self-starvation and drugs, notably cocaine, as ways to reach unhealthy thinness.

If you condemn Instagram but don’t include that industry among the culprits, you’ll have Snap with unhealthy models, TikTok with unhealthy models, or whatever comes next with unhealthy models.


How does Instagram re-socialise people with physical handicaps?


Allowing them to find friends in similar situations who understand what they live with.


Hmm that's any piece of connected software on the internet though. Shutting down instagram wouldn't compromise that.


Connected software, with very few exceptions, doesn’t attract a very large audience. For rarer conditions, you aren’t going to find that person easily. Having worldwide interest groups (like subreddits or thematic Discord channels) helps, but it’s conditional on people using either Reddit or Discord. Facebook and Instagram still win compared to those two contenders.

That’s of course assuming that Reddit and Discord wouldn’t get much bigger if Meta were to close, but I’m not sure the negative effects would vanish because the owner is different.


is it safe to assume that you've worked at meta at some point? because this comes across as a little defensive

The post is about Instagram, so I'm talking about Instagram; feel free to post an article about magazines if you want to talk about magazines... legislation aimed at Instagram will create precedent to pursue other platforms like TikTok and can also become ingrained as policy. Just because we're focusing on one subject here doesn't mean we're ignoring the others or playing favorites.


The post isn’t about Instagram to the exclusion of the fashion ecosystem: it’s an argument that truncates research which finds that the drop in self-esteem is directly related to seeing posts either from fashion professionals or from peers trying to emulate them. You can’t read those studies and think that detail isn’t part of the key findings.

Those are not two similar but separate problems.

This is like saying that mail bombs are the problem of the post office and not of people having access to explosives: the problem isn’t envelopes. It’s people with access to explosives, and FedEx would likely have the same issue, unless federal rules allow them to scan packages and refuse to deliver explosives.

I’m not being crass with a bad metaphor: focusing on the medium rather than the source has been a key flaw in the argument for a while. Fashion magazines used to be decried as “glossy paper,” but no one thought the difference between magazines pushing a problematic self-image and serious news was that newspapers came on broad sheets of matte paper. Still, that’s how the problem is now presented: without clear separation.

I did work at Meta on the team looking at Teenagers, and I did raise that point internally: should we look at gambling and alcohol and extend the same rules to fashion? Should we boost peers over brands to avoid problematic ideation?

Those conversations, without the shadow of fashion partners, were generally productive. It wasn’t perfect: the goal remained “engagement” but there were no sacred cows to avoid.

As soon as the findings about teenage body dysmorphia were put in the context of fashion (and presented to one particular executive who cared about advertising more than algorithmic boost), that question was buried. Several friends of mine got blackballed hard, not for suggesting Instagram-specific treatment, but for asking, for instance, Anna Wintour about the quip where she fat-shamed her best friend in _The September Issue_ (a niche fashion reference, but widely considered a smoking gun in the industry).

There are inherent biases to how Instagram and Facebook work: you post when you achieve something, so there’s a bias towards success, but the internal findings rarely found those to be crippling. Thinness, on the other hand, leads to clear, widespread medical issues. I remember asking whether the issue became more prevalent because teenagers didn’t have access to that many magazines before, or whether the dose effect was comparable. I don’t think anyone has looked into that.

I’m not saying that to deflect Meta’s responsibility: I had argued for explicit filters, like giving teenage boys who know they can’t resist and will behave in a way they consider unhealthy the ability to exclude scantily clad women from their feed. (I used another example internally, one more typical of internal debate.) That idea could have had legs ten years ago; now, with the debate being a lot less constructive, I’m not sure.


the post office did actually implement stricter screening, including x-rays, chemical detection, and GPS tracking, due to mail bombs... in a way mail bombs sped up the process of implementing package tracking systems for everyone — they took a very serious approach to safety despite not being the direct cause of the problem

from my perspective, this is the kind of regulation the FTC should seek out — so for different reasons I agree with your analogy

magazines are an entirely different beast and it's strange to not address that — I have to physically go out of my way to purchase a magazine or have one delivered... that's something I opt in to

people opt in to instagram, but for teenagers there are much larger stakes involved... instagram has become an integral part of many teens' social life (by design) and they're constantly getting targeted and personalized ads fed to them directly

the fact that the goal remains engagement makes it pretty clear where meta's priorities lie, and it's not on the side of safety


I’m good with damning them all and getting rid of shit technology directions that hurt people, especially kids.

Who is actually ok with this?


The people making money from it; either the company itself or the employees working for it that are designing it. They clearly do not have issues with it.


lol. The meta drones showed up to downvote the comment you replied to, so I’m going to say you are spot on.


Here's the original Facebook internal research, if people want to compare and contrast the actual results with the media coverage: https://about.fb.com/wp-content/uploads/2021/09/Instagram-Te...



