The Wall Street Journal published a report that delves into the personal experiences of young girls who were pushed toward extreme weight-loss challenges and dangerous diets on TikTok, content that contributed to the development of eating disorders or made existing disorders worse.
The newspaper conducted its own experiment to see how the platform's algorithm can promote this type of harmful content. Its findings may explain TikTok's sudden decision to change the way its video recommendation system works.
As detailed in the report, the Wall Street Journal created more than 100 accounts that browsed the app with minimal human intervention. Twelve of them were bots registered as 13-year-olds that spent time watching videos related to weight loss, alcohol, and gambling.
A graph included in the report shows that when one bot abruptly stopped watching gambling videos and began spending time on weight-loss videos instead, the platform's algorithm adjusted accordingly, rapidly increasing the number of weight-loss videos shown to the bot in response to the shift in behavior.
By the end of the experiment, the newspaper found that of the 255,000 videos the bots viewed in total, 32,700 contained descriptions or metadata matching a list of hundreds of keywords related to weight loss.
Of those, 11,615 videos contained text descriptions matching keywords related to eating disorders, and 4,402 contained keywords indicating the normalization of eating disorders.
A number of these videos reportedly used altered spellings of eating-disorder-related keywords to avoid being flagged by TikTok.
After the Wall Street Journal alerted the platform to a sample of 2,960 videos linked to eating disorders, 1,778 of them were removed. The newspaper says it is unclear whether they were removed by TikTok or by the content creators themselves.
A day before the WSJ report was released, the platform announced that it was working on new ways to prevent this dangerous phenomenon.
The change also came a few days after the newspaper said it had contacted the platform for a statement about its upcoming report, so it is possible that TikTok preemptively announced the update ahead of the report's release.
The platform says it is not always healthy to view certain types of content over and over again, including videos related to extreme dieting and fitness.
The platform is now working on a way to detect when its recommendation system is inadvertently surfacing videos that may not violate its policies but could be harmful if consumed in excess.
The platform also says it is testing a tool that lets users block videos containing certain words or hashtags from appearing on their For You page.