The app’s algorithm can send users down rabbit holes of narrow interest, surfacing potentially dangerous content such as images of emaciated bodies, purging techniques, hazardous diets and body shaming
By Tawnell D. Hobbs, Rob Barry and Yoree Koh
Dec. 17, 2021 10:45 am ET
TikTok is flooding teen users with videos of rapid-weight-loss competitions and ways to purge food that health professionals say contribute to a wave of eating-disorder cases spreading across the country.
A Wall Street Journal investigation involving the creation of a dozen automated accounts on TikTok, registered as 13-year-olds, found that the popular video-sharing app’s algorithm served them tens of thousands of weight-loss videos within a few weeks of joining the platform.
Some included tips about taking in less than 300 calories a day; several recommended consuming only water on some days; another suggested taking laxatives after overeating.
Other videos showed emaciated girls with protruding bones, a “corpse bride diet,” an invitation to a private “Christmas-themed competition” to lose as much weight as possible before the holiday, and a message shaming those who give up on getting thin: “You do realise giving up after a week isn’t going to get you anywhere, right?…You’re disgusting, it’s really embarrassing.”
On Thursday, several days after the Journal sought comment for the findings detailed in this article, TikTok said it would adjust its recommendation algorithm to avoid showing users too much of the same content, part of a broad re-evaluation of social-media platforms and the potential harm they pose to younger users. The company said it is testing ways to avoid pushing too much content from a certain topic to individual users—such as extreme dieting, sadness or breakups—to protect their mental well-being.
TikTok said it has invested in removing content that violates its rules and will continue to do so. Most of the pro-eating-disorder videos served to the Journal’s accounts, or bots, had fewer than 10,000 views, and many were later removed from the app—whether by TikTok or their creators is unclear.
Still, many videos elude TikTok’s monitors. Users tweak hashtag spellings or the text in videos, such as writing d1s0rder for disorder. And innocent-sounding hashtags, such as #recovery, sometimes direct users to videos idealizing life-threatening thinness, the Journal found.
A TikTok spokesperson said the Journal’s experiment doesn’t reflect the experience most people have on the site, but that even one person having that experience is one too many. The spokesperson said access to the National Eating Disorders Association helpline is provided on the app.
Eating disorders among young people are surging across the U.S. in the wake of the Covid-19 pandemic. Health professionals say the disorders often come with other issues such as depression, anxiety or obsessive-compulsive disorder, and have worsened as kids have spent more time on their screens in isolation.
Other social-media platforms popular with teens have been criticized for not doing enough to address content promoting eating disorders. The Journal reported in September that researchers at Instagram, owned by Meta Platforms Inc., found that the photo-sharing app made some teen girls who struggled with their body image feel worse about those issues.
TikTok can be uniquely insidious for young people, because of its video format and powerful algorithm, said Alyssa Moukheiber, a dietitian at Timberline Knolls, a treatment center outside of Chicago.
“The algorithm is just too freaking strong,” in how it rapidly identifies a person’s interests and sends kids harmful streams of content that can tip them into unhealthy behavior or trigger a relapse, Ms. Moukheiber said.
Millions of teens have flocked to TikTok, owned by Beijing-based ByteDance Ltd., making it the most downloaded app in Apple’s App Store this year. TikTok attracts kids with its short homemade videos. Its algorithm stands out among other social media, such as YouTube and Instagram, for quickly assessing users’ interests and providing a highly personalized stream of videos.
A recent Journal investigation showed how TikTok can quickly drive minors into endless spools of content about sex and drugs. It can also steer them to unhealthy places where skeletal bodies and feeding tubes are touted like a badge of honor.
Several teens, including Andie, told the Journal that videos from complete strangers steadily popped up in their feeds, unlike some other social-media sites that focus more on content from users’ friends.
The teens believe TikTok’s nonstop stream of videos worsened their eating disorders more than other social media because watching was effortless. The site knew their interest in weight loss and served it up.
TikTok’s algorithm served the Journal’s bots more than 32,000 weight-loss videos from early October to early December, many promoting fasting, offering tips for quickly burning belly fat and pushing weight-loss detox programs and participation in extreme weight-loss competitions.
Not all the Journal’s bots were served weight-loss content. But once TikTok determined the bots would re-watch those videos, it speedily began serving more, until weight-loss and fitness content made up more than half their feeds—even if the bot never sought it out. One-third of the weight-loss videos were about eating disorders. Of those, nearly 40% contained text promoting or making disorders appear normal, in violation of TikTok’s rules.
TikTok’s algorithm quickly gives users the content they’ll watch, for as long as they’ll watch it. When one bot began re-watching videos about gambling, the platform pushed more of the same—until the bot was programmed to switch to dwelling on videos about weight loss, at which point the algorithm quickly adapted….