TikTok’s algorithms are promoting videos about self-harm and eating disorders to at-risk teens, highlighting concerns about social media and its impact on teen mental health, according to a report published Wednesday.
Researchers at the nonprofit Center for Countering Digital Hate created TikTok accounts for fictional teenage personas in the US, UK, Canada and Australia. The researchers running the accounts then “liked” videos about self-harm and eating disorders to see how TikTok’s algorithm would react.
Within minutes, the hugely popular platform was recommending videos about weight loss and self-harm, including those with images of models and idealized body types, images of razor blades, and discussions of suicide.
When researchers created accounts with usernames that indicated a particular vulnerability to eating disorders—names that included the words “weight loss,” for example—the accounts were fed even more harmful content.
“It’s like being stuck in a hall of distorted mirrors where you’re constantly being told you’re ugly, you’re not good enough, maybe you should kill yourself,” said the center’s CEO, Imran Ahmed, whose organization has offices in the US and UK. “It’s literally pumping the most dangerous messages out to young people.”
Social media algorithms work by identifying topics and content of interest to a user, who is then given more of it to maximize their time on the site. But social media critics say the same algorithms promoting content about a particular sports team, hobby, or dance craze can send users down a rabbit hole of malicious content.
It’s a particular concern for teens and children, who tend to spend more time online and are more vulnerable to bullying, peer pressure, or negative content about eating disorders or suicide, according to Josh Golin, executive director of Fairplay, a nonprofit that advocates for greater online protections for children.
He added that TikTok isn’t the only platform failing to protect young users from malicious content and aggressive data collection.
“All of this damage is related to the business model,” Golin said. “It doesn’t matter what the social media platform is.”
In a statement from a company spokesman, TikTok disputed the findings, noting that the researchers were not using the platform like typical users and saying that this skewed the findings. The company also said that a user’s account name shouldn’t affect the type of content the user receives.
TikTok bans users under the age of 13, and its official rules prohibit videos that encourage eating disorders or suicide. US users searching for eating disorder content on TikTok will be prompted with mental health resources and contact information for the National Eating Disorder Association.
“We regularly consult with health professionals, eliminate violations of our policies, and provide access to supportive resources to anyone in need,” read the statement from TikTok, which is owned by ByteDance Ltd, a Chinese company now based in Singapore.
Despite the platform’s efforts, researchers at the Center for Countering Digital Hate found that content about eating disorders was viewed billions of times on TikTok. Researchers found that in some cases, young TikTok users used scrambled language about eating disorders to bypass TikTok’s content moderation.
The sheer volume of harmful content being fed to teens on TikTok shows self-regulation has failed, Ahmed said, adding that federal rules are needed to force platforms to do more to protect children.
Ahmed noted that the version of TikTok offered to domestic Chinese audiences is designed to encourage content about math and science for young users and limits how long 13- and 14-year-olds can be on the site each day.
A proposal before Congress would impose new rules limiting the data social media platforms can collect about young users and would create a new office within the Federal Trade Commission focused on protecting the privacy of young social media users.
One of the bill’s sponsors, Sen. Edward Markey, D-Mass., said he was optimistic lawmakers from both parties could agree on the need for stricter regulations on how platforms access and use young users’ information.
“Data is the raw material that Big Tech uses to stalk, manipulate and traumatize young people across our country every day,” Markey said.
© Copyright 2022 The Associated Press. All rights reserved. This material may not be published, broadcast, transcribed or redistributed without permission.