Deadly by Design

TikTok pushes harmful content promoting eating disorders and self-harm into young users’ feeds

Deadly by Design – our new report – gives parents and policymakers insight into the TikTok content and algorithms shaping young lives today.


About

Content warning: eating disorders, self-harm, and suicide.

Two-thirds of American teenagers use TikTok, and the average viewer spends 80 minutes a day on the application.

The app, owned by the Chinese company ByteDance, rapidly delivers a series of short videos to users and has overtaken Instagram, Facebook, and YouTube in the bid for young people’s hearts, minds, and screen time.

And yet most people understand very little about how TikTok works or the potential dangers of the platform. Journalists love to talk about Twitter, their platform of choice. Facebook remains the most used platform worldwide, giving politicians, brands, and bad actors an unparalleled pool of potential users to target, and it has received proportionate scrutiny. But TikTok reveals a generational gap in usage and understanding. This report seeks to close that gap and give parents and policymakers insight into the content and algorithms shaping young lives today.

For our study, Center for Countering Digital Hate researchers set up new accounts in the United States, United Kingdom, Canada, and Australia at the minimum age TikTok allows, 13 years old. These accounts paused briefly on videos about body image and mental health, and liked them. What we found was deeply disturbing. Within 2.6 minutes, TikTok recommended suicide content. Within 8 minutes, TikTok served content related to eating disorders. Every 39 seconds, TikTok recommended videos about body image and mental health to teens.

The results are every parent’s nightmare: young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them, and their physical and mental health. TikTok operates through a recommendation algorithm that constructs a personalized endless-scroll ‘For You’ feed, ostensibly based on the likes, follows, watch-time, and interests of a user. CCDH researchers created “standard” and “vulnerable” accounts in the geographies covered.

Research has indicated that users who seek out content about eating disorders will often choose usernames with related language; thus, our “vulnerable” accounts contained the term “loseweight” in their usernames. TikTok identifies the user’s vulnerability and capitalizes on it. The vulnerable accounts in our study received 12 times more recommendations for self-harm and suicide videos than the standard accounts. Young people who engage with this content are left to cope with a staggering onslaught of more and more recommended videos in their feeds.

This year, for the first time, a Coroner’s inquest in the United Kingdom ruled that social media platforms contributed to the suicide of 14-year-old Molly Russell. Molly had liked, shared, or saved 2,100 posts related to suicide, self-harm, or depression on Instagram in the 6 months before her death. Molly’s inquest has shown that Big Tech’s negligence has real, life-altering consequences – and that comprehensive regulation is needed to protect children online.

This year, TikTok’s Chief Operating Officer, Vanessa Pappas, testified before the Senate Homeland Security and Governmental Affairs Committee. She stated that safety was a “priority” for her company and that the mission of TikTok was “to inspire creativity and bring joy.” Her assurances of transparency and accountability are buzzword-laden empty promises that legislators, governments, and the public have all heard before.

CCDH researchers found a community for eating disorder content on the platform, amassing 13.2 billion views across 56 hashtags often designed to evade moderation. Rather than entertainment and safety, our findings reveal a toxic environment for TikTok’s youngest users, one that is intensified for its most vulnerable.

This report underscores the urgent need for reform of online spaces. CCDH’s STAR Framework argues that legislators must require platforms to embed safety by design, provide transparency about their algorithms and economic incentives, and accept accountability and responsibility for failures to enforce their terms of service and for the harms their platforms perpetuate. Without oversight, TikTok’s opaque algorithm will continue to profit by serving its users – children as young as 13, remember – increasingly intense and distressing content without checks, resources, or support.

It should be clear that this report aims to examine TikTok’s role in recommending harmful content to vulnerable users – users who post about their mental health should by no means be shamed for sharing their experiences. For those affected by any of the issues discussed, please refer to this report’s content warning for additional resources and support.


Imran Ahmed
CEO, CCDH
