Coalition of charities and advocacy groups demands action from TikTok on the mental health and well-being of children.
Below is the full letter, with signatories, that leading charities and advocacy groups in the US and the UK have written to Eric Han, TikTok's Head of Safety, demanding action on eating disorder, body image, and mental health content on the platform.
Eric Han, Head of Safety, TikTok
5800 Bristol Pkwy
Dear Mr. Han,
We write to you as concerned researchers, activists and parents regarding the damaging effect of your platform’s content algorithm on the mental health and well-being of children. We believe it is your responsibility to take swift and decisive action to address this issue.
A recent report by the Center for Countering Digital Hate (CCDH) shows that TikTok's algorithm feeds dangerous and harmful content to young people, negatively impacting their mental health and well-being. Within 2.6 minutes, TikTok recommended suicide content to the teen accounts created by CCDH researchers. Within just 8 minutes, TikTok served content related to eating disorders. And every 39 seconds, TikTok recommended videos about body image and mental health.
TikTok is responsible for ensuring its users’ safety and well-being, especially young people who are particularly vulnerable to the negative impacts of such content. This includes protecting them from exposure to harmful content that can lead to adverse outcomes such as increased anxiety, depression, and even suicide. We are calling on you to take meaningful action to address these issues.
This could include but is not limited to the following:
- Strengthening your content moderation policies to better address harmful eating disorder and suicide content.
- Working with mental health experts and advocacy organizations to develop a comprehensive approach to identifying and removing harmful content.
- Providing resources and support to users who may be struggling with eating disorders or thoughts of suicide.
- Increasing transparency and accountability by regularly reporting on the steps you are taking to address these issues and the impact of those efforts.
Since CCDH's report was released in December 2022, you have chosen to deny the problem, deflect responsibility, and delay taking any meaningful action. TikTok has removed just seven of the harmful eating disorder hashtags we are tracking, leaving 49 active. Many of these hashtags carry no links to resources for vulnerable users. As of January 2023, these hashtags had received 14.8 billion views, an increase of 1.6 billion since we published our report.
You were presented with clear evidence of harm but continue to turn your back on the young users you claim to protect. Your silence speaks volumes.
We urge TikTok to take immediate and effective measures to address this issue and prevent future tragedies.
Center for Countering Digital Hate
Academy for Eating Disorders
American Psychological Association
Eating Disorders Coalition for Research, Policy & Action
Friends of the Earth
Health Care Voices
LOG OFF Movement
Media Matters for America
Molly Rose Foundation
National Center of Excellence for Eating Disorders
ProgressNow New Mexico
Tech Transparency Project
The Real Facebook Oversight Board
The Tech Oversight Project
United Church of Christ Media Justice Ministry
United We Dream
Youth Power Project