Mehul Reuben Das | Dec 12, 2022 2:59:03 PM IST
If there is anything Elon Musk’s Twitter saga and the Twitter Files have shown us, it is that content moderation by social media platforms is anything but simple. Platforms like Instagram and Facebook need to strike a balance between making a user’s feed as engaging as possible and keeping users, especially impressionable ones, away from harmful content. This is where most social media platforms fail miserably.

Whether it’s Twitter, Instagram or Facebook, content moderation is driven by profits and, as the Twitter Files show, by ideology more than by established policy. Instagram, in particular, has often been accused of letting profits influence how it moderates content. Image credit: AFP
A previously undisclosed internal document shows that the people who ran Meta when it was still called Facebook knew that Instagram was intentionally pushing young teenage girls towards dangerous and harmful content, and did nothing to stop it.
The document reveals how an Instagram employee investigated the platform’s algorithm and recommendations by posing as a 13-year-old girl seeking dietary advice. Instead of displaying content from qualified medical and fitness experts, the algorithm opted for more viral, higher-engagement topics that were merely adjacent to proper dieting. These “adjacent” viral topics turned out to be content about anorexia. The test user was shown graphic content and recommendations to follow accounts with names like “skinny binge” and “apple core anorexic”.
Instagram was already aware that nearly 33 per cent of teenage users on the platform feel worse about their bodies because of the content the app recommends and the algorithm it uses to organise a user’s feed. Instagram also knew that teens who used the app experienced higher rates of anxiety and depression.
This isn’t the first time Instagram’s algorithms and the content they push onto users have been a point of contention for mental health experts and advocates. Earlier this year, Instagram was officially cited by a UK coroner as a cause of death in the case of Molly Russell, a 14-year-old girl who died by suicide in 2017.
In Molly Russell’s case, one of the key questions the inquest focused on was whether the thousands of posts promoting self-harm that Molly viewed on platforms like Instagram and Pinterest contributed to her death. In his ruling, coroner Andrew Walker concluded that Russell’s death could not be considered a suicide. Instead, he described her cause of death as “an act of self-harm while suffering from depression and the negative effects of online content.” Walker at one point described content that Russell had liked or saved in the days before her death as so disturbing that he found it “almost impossible to watch”.
“The platforms operated in a way that used algorithms to drive, under certain circumstances, frenzied periods of images, video clips and text”, which “romanticized acts of self-harm” and “sought to isolate and discourage discussion with those who could have helped,” Walker said.
Cases like these have opened up debate about the content moderation policies of social media platforms and how they play out in real life. Attorney Matt Bergman created the Social Media Victims Law Center after reading the Facebook Papers, which were leaked by whistleblower Frances Haugen last year. He now works with over 1,200 families pursuing lawsuits against social media companies.
“When given the opportunity to choose between the safety of our children and profits, they always choose profits,” Bergman said in an interview with a US news agency. He argues that the design of social media platforms ultimately harms children.
Money and control
Facebook and Instagram let celebrities and politicians break its content moderation rules because it was more profitable, a damning verdict from its own oversight board reveals! #1A Daily Mail Online https://t.co/b0h1xlDmsp
— CyberChick (@warriors_mom) December 7, 2022
“They intentionally designed a product that is addictive,” Bergman said. “They understand that if kids stay online, they make more money. It doesn’t matter how harmful the material is.” Bergman argues the apps were explicitly designed to evade parental controls, and he calls for better age and identity verification protocols.
Meta’s global head of safety Antigone Davis says “we want teens to be safe online” and that Instagram doesn’t allow content promoting self-harm or eating disorders. Davis also said Meta has improved Instagram’s “age verification technology”.
Several activists and advocacy groups are of the view that content moderation across all platforms needs an overhaul. While the broadest consensus is that social media platforms should have independent moderation boards and should self-regulate content, others have expressed the need for a broader, global body that sets content moderation policies.
I hope we are all now completely weary of the idea that there is objectivity or standards applied to all content moderation on Twitter. Or anywhere else. They make it up as they go and call you stupid for questioning it. https://t.co/Vy3jexpiTe
— Rachel Bovard (@rachelbovard) December 11, 2022
Today I filed my first amended complaint against @Twitter, @instagram and @Meta for their flagrant failure to enforce their own content moderation policies against the Ayatollah and his terrorist regime. HIGHLY recommend everyone to read it. https://t.co/WhSrGZyWtO
— Rumi Parsa (@rumiparsa) December 5, 2022
Removing content moderation from platforms and appointing an independent board to oversee the moderation policies of all social media platforms opens up a whole new Pandora’s box. For example, it would make it much easier for regimes to suppress political dissent and news that is unfavourable to them. This is exactly what the Twitter Files are trying to show. The fact remains, however, that content moderation as we know it is broken and needs fixing, stat.