TikTok bombards teens with self-harm and eating disorder content within minutes of joining the platform

Posted on December 15, 2022.

Deadly by Design: TikTok pushes harmful content promoting eating disorders and self-harm into users’ feeds.

Harmful content was served every 39 seconds to our test accounts

The Center for Countering Digital Hate (CCDH) is today publishing a new report – “Deadly by Design”. The report reveals the scale and intensity with which TikTok bombards vulnerable teenagers with dangerous content that may encourage self-harm, suicide, disordered eating and eating disorders via its ‘For You’ feed.

CCDH researchers studied the TikTok algorithm by establishing two new accounts, each posing as a 13-year-old, in each of the US, UK, Australia, and Canada. One account in each country was given a traditionally female name; the second was given a traditionally female name that also contained the character string ‘loseweight’. Research has shown that users with body dysmorphia issues often express this through their usernames. In our analysis, we distinguish the accounts with ‘loseweight’ in the username as ‘vulnerable’ accounts.

CCDH researchers then recorded the first 30 minutes of content automatically recommended by TikTok to these accounts in their “For You” feed, a section of TikTok that algorithmically recommends content to users. The algorithm refines its choice of videos as it gathers more information about the user’s preferences and interests. This personalization is key to the platform’s success, but, as on all social media, the algorithms behind “For You” can connect users to harmful content if they fail to identify and downgrade that content and if community standards are poorly enforced. The algorithm is proprietary to TikTok and operates opaquely, with no meaningful transparency.

CCDH also found that TikTok hosts an eating disorder community that uses both coded and open hashtags to share content, with over 13.2 billion views of its videos.

On the “For You” feed, our research team encountered numerous videos containing potentially dangerous content about mental health, disordered eating, and self-harm. Each time researchers encountered a video on these topics or on body image, they paused on it and liked it, simulating the behavior of a young user who may be vulnerable to such content.

  • On average, our accounts were served videos about mental health and body image every 39 seconds during the study.
  • Content referencing suicide was served to one account within 2.6 minutes.
  • Eating disorder content was served to one account within 8 minutes.

Comparing standard and vulnerable accounts, our researchers were extremely disturbed to find that the volume of harmful content shown to vulnerable accounts (i.e. those with the term ‘loseweight’ in their usernames) was significantly higher than that shown to standard accounts.

  • Vulnerable accounts were served 3 times more harmful content than standard accounts.
  • They were served 12 times more self-harm and suicide videos than standard accounts.

CCDH is today launching this report with recommendations for TikTok and for legislators and regulators.

It is also today launching a Parents’ Guide, co-authored by Imran Ahmed, CEO of CCDH, and Ian Russell, Chair of Trustees at the Molly Rose Foundation.

Imran Ahmed, CEO of the Center for Countering Digital Hate, said:

Speaking about the findings:

“TikTok was designed to dazzle young users into giving up their time and attention, but our research proves that its algorithms are not just entertaining children but poisoning their minds too.

“It promotes to children hatred of their own bodies and extreme suggestions of self-harm and disordered, potentially deadly, attitudes to food.

“Parents will be shocked to learn the truth and will be furious that lawmakers are failing to protect young people from Big Tech billionaires, their unaccountable social media apps and increasingly aggressive algorithms.”

Speaking about the Parents’ Guide:

“Many parents will be familiar with that gnawing sense of dread they have on the sofa at night wondering what content their children are viewing while they are in their rooms. 

“Even younger parents who have a Pinterest Board and follow influencers on Instagram may well know nothing about how TikTok works, despite its enormous popularity and the extent to which their children use it.

“This guide seeks to close the generational gap in platform usage by informing parents about the potential harms TikTok might pose to their kids and giving them common-sense recommendations on how to have healthier parent-child discussions on potential harms.”

A spokesperson for the Molly Rose Foundation said:

“The Molly Rose Foundation endorses this report for its important work in highlighting the dangerous content promoting eating disorders and self-harm on TikTok.

“Exposing the underlying toxic content that infects so much of social media is vital in the battle to combat it.

“Self-regulation has failed in Big Tech and platforms which sell themselves as providing entertainment must pay heed to these disturbing findings.

“Through public scrutiny we must shine a spotlight on this harmful content and take affirmative action to help protect the vulnerable people exposed to it.”

Notes

  1. The Center for Countering Digital Hate (CCDH) is a US-based, not-for-profit NGO with 501(c)(3) tax-exempt status that seeks to disrupt the architecture of online hate and disinformation by increasing the political, economic and social costs of the production and distribution of hate and disinformation. CCDH’s staff is headquartered in Washington, DC. CCDH is independent and does not take money from Big Tech.
  2. Ian Russell is the father of Molly Russell and Chair of Trustees at the Molly Rose Foundation. Molly Russell was a 14-year-old British girl who took her own life. The coroner at the inquest into Molly’s death reviewed content she had viewed on social media sites “concerning or concerned with self-harm, suicide or that were otherwise negative or depressing in nature”. The coroner concluded that this content, some of which had been proactively served to Molly by algorithmic recommendation engines, “affected her mental health in a negative way and contributed to her death in a more than minimal way.”
  3. The choice to add the term ‘loseweight’ to vulnerable accounts in our study is based on research from Reset Australia: Rys Farthing, “Designing For Disorder: Instagram’s Pro-Eating Disorder Bubble In Australia”, Reset Australia, 22 April 2022, https://au.reset.tech/uploads/insta-pro-eating-disorder-bubble-april-22-1.pdf