Content moderation is guided by profits and ideology more than by policy:
If there's anything that Elon Musk's Twitter saga and the Twitter Files have shown us, it's that content moderation by social media platforms is anything but straightforward. Social media platforms like Instagram and Facebook need to strike a balance between making a user's feed as engaging as possible and keeping users, especially impressionable ones, away from harmful content. This is where most social media platforms fail miserably.
A previously unpublished document leaked from Meta shows that the people heading Meta when it was still called Facebook knew that Instagram was intentionally pushing young teenage girls towards dangerous and harmful content, and did nothing to stop it.
The document reveals how an Instagram employee investigated Instagram's algorithm and recommendations by pretending to be a 13-year-old girl looking for diet tips. Instead of showing the user content from medical or qualified fitness experts, the algorithm favoured more viral, higher-engagement topics that were merely adjacent to proper dieting. These "adjacent" viral topics turned out to be content about anorexia. The user was led to graphic content and recommendations to follow accounts titled "skinny binge" and "apple core anorexic."
[...] "Time after time, when they have an opportunity to choose between safety of our kids and profits, they always choose profits," said Bergman in an interview with a news agency in the US. He argues the design of social media platforms is ultimately hurting kids.
[...] "They have intentionally designed a product that is addictive," Bergman said. “They understand that if children stay online, they make more money. It doesn't matter how harmful the material is." Bergman argues the apps were explicitly designed to evade parental authority and is calling for better age and identity verification protocols.
Related Stories
Blogger Ben Werdmuller has discussed an article in Nature about the political impact of the algorithm(s) used by X (formerly known as Twitter). The gist is that the use of the algorithms on X's users tends to shift about 5% of them in a specific political direction. That's more than enough to tip an election one way or another, especially since the damage seems persistent and lasts even after exposure ceases.
Feed algorithms are widely suspected to influence political attitudes. However, previous evidence from switching off the algorithm on Meta platforms found no political effects. Here we present results from a 2023 field experiment on Elon Musk's platform X shedding light on this puzzle. We assigned active US-based users randomly to either an algorithmic or a chronological feed for 7 weeks, measuring political attitudes and online behaviour. Switching from a chronological to an algorithmic feed increased engagement and shifted political opinion towards more conservative positions, particularly regarding policy priorities, perceptions of criminal investigations into Donald Trump and views on the war in Ukraine. In contrast, switching from the algorithmic to the chronological feed had no comparable effects. Neither switching the algorithm on nor switching it off significantly affected affective polarization or self-reported partisanship. To investigate the mechanism, we analysed users' feed content and behaviour. We found that the algorithm promotes conservative content and demotes posts by traditional media. Exposure to algorithmic content leads users to follow conservative political activist accounts, which they continue to follow even after switching off the algorithm, helping explain the asymmetry in effects. These results suggest that initial exposure to X's algorithm has persistent effects on users' current political attitudes and account-following behaviour, even in the absence of a detectable effect on partisanship.
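Purely as an illustration of the statistics behind such a two-arm randomized field experiment (this is a minimal sketch, not the study's actual code or data), the headline causal estimate amounts to a difference in group means with a confidence interval. The synthetic data and all variable names below are assumptions:

    # Minimal sketch (illustrative only): estimating the average treatment
    # effect in a two-arm randomized experiment like the one described above.
    # Users are randomly assigned to an algorithmic or a chronological feed,
    # and an attitude score is measured after the exposure period.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic attitude scores (higher = more conservative), standing in
    # for the survey measures the paper describes.
    algorithmic = rng.normal(loc=0.15, scale=1.0, size=500)    # treatment arm
    chronological = rng.normal(loc=0.00, scale=1.0, size=500)  # control arm

    # Average treatment effect: difference in group means. Random assignment
    # makes this an unbiased estimate of the feed algorithm's causal effect.
    ate = algorithmic.mean() - chronological.mean()

    # Standard error of the difference (Welch-style, unequal variances),
    # then a 95% confidence interval via the normal approximation.
    se = np.sqrt(algorithmic.var(ddof=1) / algorithmic.size
                 + chronological.var(ddof=1) / chronological.size)
    low, high = ate - 1.96 * se, ate + 1.96 * se
    print(f"ATE = {ate:.3f}, 95% CI [{low:.3f}, {high:.3f}]")

The asymmetry the authors report (effects when switching the algorithm on, but not off) is what the follow-on analysis of account-following behaviour is meant to explain.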
It should be added that the effect has already been seen in multiple countries. For example, the elections in Turkey were affected by outright censorship within X. And the impact from the CCP-linked ByteDance's TikTok is likely even more severe, not to mention multiple experiments in manipulation on Meta's properties like Facebook.
Journal Reference: Gauthier, G., Hodler, R., Widmer, P. et al. The political effects of X's feed algorithm. Nature (2026). https://doi.org/10.1038/s41586-026-10098-2
Previously:
(2026) How Screwed is Generation Alpha, and the Generations Which Will Depend on Them?
(2025) European Union Orders X to Hand Over Algorithm Documents
(2024) Six Months Ago NPR Left Twitter. The Effects Have Been Negligible
(2023) Utah Sues Tiktok For Getting Children 'Addicted' To Its Algorithm
(2022) Leaked Documents Reveal Instagram Was Pushing Girls Towards Content That Harmed Mental Health
(2022) Musk Buying Twitter Is Not About Freedom of Speech
... and more
The US Surgeon General has published his 2023 advisory on social control media and youth mental health [warning for PDF]. The report's scope covers only the health and mental health effects, not the weaponized nature of the phenomenon. The body of the report is 17 pages long and includes a call to action.
Extreme, inappropriate, and harmful content continues to be easily and widely accessible by children and adolescents. This can be spread through direct pushes, unwanted content exchanges, and algorithmic designs. In certain tragic cases, childhood deaths have been linked to suicide- and self-harm-related content and risk-taking challenges on social media platforms. This content may be especially risky for children and adolescents who are already experiencing mental health difficulties. Despite social media providing a sense of community for some, a systematic review of more than two dozen studies found that some social media platforms show live depictions of self-harm acts like partial asphyxiation, leading to seizures, and cutting, leading to significant bleeding. Further, these studies found that discussing or showing this content can normalize such behaviors, including through the formation of suicide pacts and posting of self-harm models for others to follow.
Social media may also perpetuate body dissatisfaction, disordered eating behaviors, social comparison, and low self-esteem, especially among adolescent girls. A synthesis of 20 studies demonstrated a significant relationship between social media use and body image concerns and eating disorders, with social comparison as a potential contributing factor. Social comparison driven by social media is associated with body dissatisfaction, disordered eating, and depressive symptoms. When asked about the impact of social media on their body image, nearly half (46%) of adolescents aged 13–17 said social media makes them feel worse, 40% said it makes them feel neither better nor worse, and only 14% said it makes them feel better.
Previously:
(2023) Seattle's Schools are Suing Tech Giants for Harming Young People's Mental Health
(2022) Leaked Documents Reveal Instagram Was Pushing Girls Towards Content That Harmed Mental Health
(2022) Social Media Break Improves Mental Health
(2021) Facebook Documents Show How Toxic Instagram is for Teens, Wall Street Journal Reports
(Score: 2) by mcgrew on Tuesday December 20 2022, @04:04PM (1 child)
No! Not in "Christian", apple pie America! Are you sure you're not talking about 1940 Italy?
Are the Republicans really in favor of genocide, or are they just cowards terrified of terrorist twit Trump?
(Score: 2) by RamiK on Tuesday December 20 2022, @06:34PM
If the author's suggestion that the Twitter Files taught us something means anything, it's that the author and their audience have been living under a rock for most of their lives.
compiling...
(Score: 0) by Anonymous Coward on Tuesday December 20 2022, @10:56PM
If you are "pretending to be a 13-year-old girl" [*] on the internet, Instagram will steer you toward harmful content.
--
* Haven't we all?