YouTube’s Anorexia Algorithm

How YouTube recommends eating disorder videos to young girls


New research by CCDH shows that YouTube pushes dangerous videos to young girls including eating disorder and self-harm content.



An intro from CCDH CEO Imran Ahmed

I have long been honored to call Ian Russell and Kristin Bride my friends, even if I wish my work at CCDH hadn’t brought us together. Ian and Kristin, like far too many parents around the world, share the trauma of having lost a child to the dangers of social media. 

Seven years ago, Ian’s daughter Molly – just fourteen at the time – took her own life after being exposed to an algorithmically-accelerated spiral of online content that normalized and encouraged her to self-harm. Kristin’s son Carson, who was 16, took his life after he was cyberbullied on Snapchat. 

Hearing Ian and Kristin’s stories of grief, I cannot understand how any social media company would not feel compelled to immediately pull the emergency brake and fix their platforms so no other child falls victim to the same fate. 

While I thought I understood Ian and Kristin’s pain before, now that I have children, I know I was only beginning to fathom the heartache they endure every day. 

I have experienced many so-called “important” moments – leaving my family to go to college, graduating, my first day at work, my marriage, turning various milestone birthdays – but none compares to the moment we welcomed our first child into the world.  

I felt an overwhelming rush of responsibility and, truthfully, the terror of being unprepared. Am I ready to be a father? Am I ready to parent a child in our complicated, ever-changing world? I whispered to my little one in the delivery room that being a father is now my life’s purpose. 

So, when I speak with parents who have been through the unthinkable, whose agony could have been prevented if technology companies had taken responsibility for potential harm generated by their platforms, it reinforces to me the urgency of our mission. We must all make this promise collectively to future generations: to protect them. It is the cornerstone of civilization. 

This new report is a devastating indictment of the behavior of social media executives, regulators, lawmakers, advertisers, and others who have failed to abide by this collective promise by allowing eating disorder and self-harm content to be pumped into the eyeballs of our children for profit. It is a clear, unchallengeable case for immediate change. 

Nine out of ten teens in the United States use YouTube, a fifth of them “almost constantly.” It is used by far more young people than TikTok or Snapchat. At the same time, around the world, we are experiencing a crisis in young people’s mental health. The number of children developing eating disorders has increased significantly in several countries, and there is evidence that social media is contributing to the problem. Between 2000 and 2018, the global prevalence of eating disorders doubled. In 2021, the US Centers for Disease Control found that 1 in 3 teen girls seriously considered attempting suicide, up 60% from the previous decade.

YouTube has acknowledged the problem in the past and claims to try to avoid contributing to it, but our research shows they have fallen far short. CCDH put it to the test: we examined the recommendations that a teen girl would receive when watching an eating disorder video for the first time. All that YouTube knew about our test accounts was that each belonged to a 13-year-old girl with no prior viewing history. Its algorithm would determine what this girl would see across 1,000 video recommendations. What we found will chill you to the bone – and shows the deadly risks facing every child who uses these platforms. 

If a child approached a health professional, a teacher, or even a peer at school, asked about extreme dieting or showed signs of clinical body dysmorphia, and the response was to recommend an ‘anorexia boot camp diet’, you would never allow your child near that person again. You’d warn everyone you know about their behavior. 

Well, that’s precisely what YouTube did – pushed this user towards harmful, destructive, dangerous, self-harm-encouraging content. 

One in three recommendations were for harmful eating disorder videos that could deepen an existing condition or anxieties about body image. 

Two in three were for eating disorder or weight loss content. 

And then, as if encouraging eating disorders weren’t enough, YouTube sometimes pushed users to watch videos about self-harm or suicide. 

Beyond merely pushing related content, YouTube’s algorithm has a deep, sick sense of a teen’s vulnerable psychology, and in its infinite corporate wisdom it encourages users to watch this content – all in search of ever greater profits. 

Next to these videos – videos about how to starve yourself, or even how to harm yourself – YouTube ran ads from prominent multinational corporations, making money from them. YouTube is placing ads for Nike, T-Mobile and HelloFresh next to this harmful content. 

Of course, when we reported these horrific videos to the platform, YouTube failed to remove or age-restrict them 4 out of 5 times. And sickeningly, this is legal. 

Most parents will be staggered to learn that not only is this lawful, but that YouTube is explicitly protected from liability for this behavior by U.S. law – specifically Section 230 of the Communications Decency Act of 1996. 

YouTube itself has made promises to parents that our data proves were, in reality, worthless. They claim to remove harmful eating disorder content – they don’t. They claim to age-restrict this content – they don’t. They claim to make “responsible” video recommendations – they do not.

As a parent, and as a human being, this scares me. I promised to keep my children safe, but how can I be sure of keeping that promise when they have access to YouTube? How can I be sure that algorithmically accelerated, monetized disordered eating and self-harm content will not encourage an outcome that no parent wants to even contemplate? We must hold YouTube to account. This report gives us all the evidence we need to take action. 

It starts with demanding more from social media platforms, but also from the advertisers who contribute most of their revenues and the legislators who claim to have our backs but have sat back and done nothing while intersecting crises in mental health, suicide, and eating disorders devastate the lives of our children. 

Imran Ahmed 
CEO, Center for Countering Digital Hate