YouTube’s EU Anorexia Algorithm: How YouTube recommends eating disorder videos to young girls in Europe

Posted on February 03, 2025 in Press releases.

  • The Center for Countering Digital Hate (CCDH) investigated YouTube’s recommendation algorithm in the EU and found that troubling recommendations are shown to simulated 13-year-old users who watch a video about eating disorders for the first time.
  • YouTube is actively pushing eating disorder content — 1 in 3 videos recommended to a simulated 13-year-old girl’s account used in the study contained harmful eating disorder content.
  • YouTube is only applying crisis panels in 2 out of 27 EU countries.
  • Out of 100 recommendations studied by CCDH: 33 were for harmful eating disorder content and 72 were for content about eating disorders or weight loss.
  • YouTube failed to remove, age-restrict or label 22 of 28 videos (79%) reported with an EU account, despite claiming to do so in its DSA risk assessment.
  • “Every day this content is allowed to circulate, accelerated by powerful recommendation engines, more and more children are carelessly and unnecessarily put at risk,” said Imran Ahmed, CEO of CCDH.

WASHINGTON, D.C., February 3rd, 2025 – YouTube is actively pushing harmful eating disorder content to vulnerable adolescents in the EU.

In a new report, the Center for Countering Digital Hate (CCDH) investigated YouTube and its algorithm in the EU for how recommendations are pushed to a fictional 13-year-old user who watches a video about eating disorders for the first time.

Researchers created a new YouTube account presenting as a 13-year-old girl based in the EU who watches a video about eating disorders. CCDH analyzed a total of 100 recommendations collected using this approach. The EU report’s findings are troublingly consistent with another current CCDH study about YouTube video recommendations in the UK as well as a recent CCDH study analyzing 1,000 videos presented to simulated users based in the United States.

Videos were sorted into the following categories:

  • Harmful eating disorder content breaching YouTube’s policies
  • Other eating disorder content
  • Weight loss content
  • Unrelated content

1 in 3 recommended videos contained harmful eating disorder content — content that directly violates YouTube’s policies.

Example eating disorder trends recommended by YouTube:

  • Anorexia Boot Camp — videos detailing a 30-day food regimen with daily limits of 0-500 calories, intended to induce anorexia
  • Thinspo and skeletal imagery — videos featuring emaciated bodies to inspire thinness
  • What I Eat in a Day — videos showcasing the daily eating habits of those with eating disorders, often depicting extreme calorie restriction

YouTube uses what it calls “crisis resource panels” to connect users who may be at risk to third party support services.

We found that crisis resource panels only appeared when tests were carried out in 2 out of 27 EU countries: France and Germany. When accessed from the remaining 25 countries, the panels didn’t appear at all.

In addition to promoting these troubling topics, YouTube is actively hosting advertising from global brands next to videos about eating disorders. Ads from brands such as Salesforce and Hydepark Environmental were displayed.

Imran Ahmed, CEO of the Center for Countering Digital Hate, said:

“YouTube is profiting by preying on vulnerable children. Young people are impressionable, and YouTube’s decision to not only host but promote this content next to high-profile advertisers is reprehensible and potentially deadly.

“YouTube has long been a popular place where children and young adults find content. The company’s failure to enforce its own policies, despite its claims to specifically protect children from exactly this kind of content, represents a gross failure of duty and an abdication of good corporate citizenship. It is clearer than ever that companies will not self-regulate. YouTube is yet another clear example of why we need to raise awareness and alert young people and their families to the dangers of using these platforms, as we press for greater transparency and accountability.

“YouTube’s risk mitigation reports on DSA compliance are incomplete, or at best lacking. CCDH research shows that crisis resource panels are in force in only 2 of 27 EU countries. How can a platform that claims to be global leave most of Europe out in the cold? What we need is responsible and consistent behavior, not empty or misleading promises.

“Every day this content is allowed to circulate, accelerated by powerful recommendation engines, more and more children are put at risk. It is time to put an end to a perverse system that encourages profit at the expense of our children.”

Tom Quinn, Director of External Affairs at Beat, said:

“It’s incredibly worrying that harmful content is being relentlessly pushed out to vulnerable young people. It’s even more concerning that YouTube doesn’t appear to be acting on removing the content, and in some cases is profiting from it. Whilst so-called ‘pro-ana’ content won’t cause an eating disorder on its own, it could exacerbate existing symptoms or cause those affected to copy the harmful behaviours they see, making them more unwell. People have also reported feeling compelled to watch content like this when it appears, particularly if it does so without warning.

“We know that YouTube and other social media platforms can be a great source of comfort for our community, and creators and channels who champion recovery can be a real force for good. YouTube should be a safe space, not somewhere where people are exposed to more risk. We urge YouTube to remove this content immediately, introduce immediate safeguards to ensure that harmful content is stopped before it’s uploaded, and to ensure paid advertisements are not being placed on these kinds of videos.”

Notes to Editor:

FOR IMMEDIATE RELEASE: Here is the link to the full EU report; we also invite you to follow this link to read the UK report.

To arrange an interview please contact [email protected]