How the UK’s Online Safety Bill would help keep children safe
So, how could the Bill protect children and teens from online harms?
In this blog, we explain how the Bill would address the harms children and teens face on social media platforms, and how those platforms are failing to tackle bullying, harassment, and the promotion of eating disorders, amongst other harms.
How social media platforms are failing
CCDH’s research has highlighted the serious harms that children and teens face on social media. For instance, we investigated how Instagram creates a rabbit hole of harmful content: its algorithm recommends further eating disorder content to users who have engaged with an eating disorder post or account, a problem psychiatrists have warned is growing.
We’ve also exposed the harms that children and teens face on emerging platforms, including virtual reality (VR). Our report on Facebook’s VR Metaverse documented at least 100 incidents in which young people were subjected to abuse, including hate, harassment, and bullying. On top of that, we found that children and teens are being exposed to graphic sexual content and groomed into repeating racist slurs and extremist language.
Time and time again, we see platforms failing to enforce their own community standards. As a consequence, children are exposed to harmful content that can have a lasting impact on their offline lives.
How could the Bill help?
The Online Safety Bill would make content that is harmful to children a high priority for social media platforms. They would need to put a range of measures in place to address such harmful content. This would include removing harmful content from their platforms and designing systems with child safety in mind.
Importantly, this means that platforms would have a duty to prevent children and teens from accessing some of the content most harmful to them, including:
- Content promoting self-harm or suicide
- Content promoting eating disorders
The Bill would achieve this by regulating platforms, not users or their posts. If companies fail to comply with their new duties, the media regulator Ofcom would have the power to impose fines of up to £18 million or 10% of global annual revenue, whichever is higher.
Why you should care
The Bill would introduce important protections for children and teens. Platforms likely to be accessed by children would be required to design their systems with child safety in mind, helping to protect young users from sexual abuse material and from content promoting self-harm and eating disorders.
The passage of the Bill would be an important step towards making the internet safer for children, women, people of colour, and LGBTQ+ people.
For years, social media companies have put profit before people, maximising the money they make from users like us instead of keeping us safe. This contributes to problems like racist abuse, dangerous health misinformation, self-harm, and eating disorder content that can ruin children’s lives.
The Bill has the potential to change the balance, by introducing penalties for social media companies that fail to keep users like us safe. That’s a new incentive to put people’s safety before ever-growing profits.
CCDH will be campaigning alongside others to support and strengthen this critical Bill, ensuring the UK government delivers on its manifesto promise to make the internet safer for everyone, including our children.
Sign up today to be the first to hear about the Bill and its progress.