TikTok self-harm study finds ‘every parent’s nightmare’

TikTok’s recommendation algorithm pushes content about self-harm and eating disorders to teens within minutes of expressing interest in the topics, research shows.

The Center for Countering Digital Hate (CCDH) found that the video-sharing site would promote material such as dangerously restrictive diets, pro-self-harm content and content romanticizing suicide to users who showed a preference for it, even if they were registered as under 18.

For its study, the campaign group set up accounts in the US, UK, Canada and Australia, registered as 13-year-olds, the minimum age to join the service. It created “standard” and “vulnerable” accounts, the latter containing the term “lose weight” in their usernames, which CCDH said reflected research showing that social media users who look for eating disorder content often choose usernames containing related language.

The accounts “paused briefly” on videos about body image, eating disorders and mental health, and liked them too. This took place over an initial 30-minute period when the accounts were launched, with the aim of capturing the effectiveness of TikTok’s algorithm in recommending content to users.

On the “standard” accounts, suicide content was served within nearly three minutes, and eating disorder material within eight minutes.

“The results are every parent’s nightmare,” said Imran Ahmed, CEO of CCDH. “Young people’s feeds are bombarded with harmful and heartbreaking content that can have a significant cumulative impact on their understanding of the world around them and on their physical and mental health.”

The group said the majority of mental health videos recommended to its standard accounts through the For You feed, the primary way TikTok users experience the app, consisted of users sharing their anxieties and insecurities.

Body image content was more damaging, according to the report, with accounts registered to 13-year-olds being shown videos advertising diet drinks and “tummy tuck” surgery. One animation shown to standard accounts featured audio saying “I starved myself for you” and had more than 100,000 likes. The report said accounts were shown self-harm or eating disorder videos every 206 seconds.

The researchers found that videos relating to body image, mental health and eating disorders were shown three times more to “vulnerable” accounts than to standard accounts. Vulnerable accounts received 12 times more recommendations for self-harm and suicide videos than standard accounts, according to the report.

Recommended content was more extreme for vulnerable accounts, including methods of self-harm and young people discussing plans to kill themselves. CCDH said a video related to mental health or body image was shown every 27 seconds, although the content was dominated by mental health videos, which CCDH defined as videos about anxiety, insecurity and mental health issues, excluding eating disorders, self-harm and suicide.

The group said its research did not distinguish between content with positive intent, such as videos discussing recovery, and negative content.

A spokesperson for TikTok, which is owned by the Chinese company ByteDance and has more than a billion users worldwide, said the CCDH study did not reflect the experience or viewing habits of real users of the app.

“We regularly consult with health experts, remove violations of our policies, and provide access to support resources for anyone in need,” they said. “We recognize that triggering content is unique to each individual and remain focused on creating a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others about these important topics.”

TikTok’s guidelines prohibit content that promotes behavior that could lead to suicide and self-harm, as well as material that promotes unhealthy behaviors or eating habits.

Britain’s Online Safety Bill proposes to require social networks to take action against so-called ‘lawful but harmful’ content presented to children.

A DCMS spokesperson said: “We are ending unregulated social media that harms our children. Under the Online Safety Bill, technology platforms will have to prevent people under 18 from being exposed to illegal content that aids suicide, and protect them from other content that is harmful or inappropriate for their age, including the promotion of self-harm and eating disorders, or face huge fines.”

In the UK, the youth suicide charity Papyrus can be contacted on 0800 068 4141 or email pat@papyrus-uk.org. In the UK and Ireland, the Samaritans can be contacted on 116 123 or by email at jo@samaritans.org or jo@samaritans.ie. In the United States, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the Lifeline crisis helpline is 13 11 14. Other international helplines are available at www.befrienders.org. You can contact mental health charity Mind by calling 0300 123 3393 or visiting mind.org.uk.

