The UK’s Online Safety Act: A Year On
The United Kingdom’s Online Safety Act became law one year ago, on 26 October 2023. This landmark piece of legislation sets the rules for social media platforms and search engines operating in the UK, putting the country at the forefront of online safety regulation.
The Center for Countering Digital Hate (CCDH) believed in the Online Safety Act from its inception. Our CEO was the first to give evidence in support of the bill before Parliament. Now passed, CCDH is advising Ofcom, the UK’s online safety regulator, on how to ensure that safety is a fundamental part of being a digital leader.
On its first birthday, CCDH is asking: why is the Online Safety Act important? How will it protect me online? And what is next for social media reform?
Why is the UK Online Safety Act important?
The Online Safety Act (OSA) makes the UK one of the first countries to regulate global social media companies like Facebook, YouTube, Google, X, and TikTok. The UK’s leading regulation can be an example to other countries as they seek to hold social media giants accountable and strengthen online safety. Making sure the OSA is a success is one of CCDH’s top priorities. Our research has shown how unaccountable social media companies enable and amplify online hate, disinformation, and harms, highlighting the need for urgent change.
Now, CCDH is providing key evidence upon which Ofcom can act to enforce the rules of the OSA, in submissions on illegal online content, the protection of children, and transparency reporting. The Online Safety Act aims to make the UK “the safest place in the world to be online”. CCDH believes its potential goes even further: the Online Safety Act can serve as a model for online safety legislation all over the world, with its principles adapted to different national contexts. If it succeeds, it will make the UK a global online safety leader.
How will the Online Safety Act protect me online?
The Online Safety Act sets the rules that social media platforms and search engines in the UK have to follow. Platforms are required to identify and remove illegal content, conduct risk assessments, and provide a higher level of online protection for children.
Although the Online Safety Act was passed in October 2023, its rules haven’t come into force yet. This is because Ofcom is still drafting the guidance platforms need to make sure they’re correctly following the rules.
Once the rules come into force, what differences can UK users expect to see on social media platforms?
1. Removing and reducing illegal content
The first change users will notice is related to illegal content like violence, fraud, suicide, and revenge porn. Ofcom’s Codes of Practice on illegal content will be published in December 2024.
Put simply, platforms must put in place systems to stop illegal content spreading across their services, meaning that users should encounter less illegal content online than they did before. Social media platforms that fail to comply with these rules face substantial fines.
2. Protecting kids from harmful content
The next change will come for children online. The OSA says that platforms have to take extra steps to protect kids from content that is harmful to them, but is legal for adults to access. This includes restricting access to eating disorder and self-harm content, and making it harder to access adult content like pornography.
The technologies that platforms use to suggest content to users, their algorithms and recommender systems, will have to stop promoting harmful content to children. Platforms will also need to clamp down on bullying, dangerous online dares, and viral harmful challenges. They must be clear about their age requirements and enforce any age limit consistently across their service.
3. Giving adults more choice over what they see
Platforms will provide new tools and options for adult users, giving them more control over what they see and engage with online. These user empowerment tools can include filters, opt-ins and opt-outs, and settings that affect which categories of content the platform promotes to us. It will also be easier to block strangers from messaging you and viewing your profile.
Overall, it should be clearer to users what type of content is allowed on each different platform they use, meaning they can make more informed choices about which platforms they want to be present on.
4. Criminalizing violence and threats online
The OSA also created new criminal offences. It is now a criminal offence to encourage or assist in serious self-harm, send false information intended to cause harm, send genuinely threatening messages, or share intimate images of another person without their consent.
If an individual does any of the above online, they can be investigated by the police and prosecuted in court. Prosecutors used these new offences to charge individuals over their social media posts during the July 2024 UK riots.
5. Increasing transparency for researchers, journalists, and parents
The largest social media platforms will have to compile annual transparency reports, including information about how well their safety measures are working and what plans they have to improve.
Ofcom will then produce a summary of all the platforms’ responses, comparing services and analyzing trends so that any user, whether a journalist, researcher, or parent, can better understand the online harms ecosystem.
6. Extra steps to avert violence against women and girls
The Online Safety Act recognizes that women and girls are disproportionately affected by harmful online interactions and requires Ofcom to give guidance to platforms on preventing harmful gender dynamics on social media.
Ofcom will conduct a review of online harms affecting women and girls and publish official guidance for social media platforms and search engines on addressing them.
After the OSA, what’s next for social media reform?
The Online Safety Act is an enormous step forward on the path to a safer digital world. But there are other areas in need of reform.
The riots which followed the Southport stabbings in July 2024 revealed gaps where online hate led to offline violence. CCDH analysis of social media activity following the attack identified key elements that are unevenly covered by the OSA:
- The viral spread of unsubstantiated claims
- Recommendation systems exposing falsehoods to ever wider audiences
- The role of far-right influencers with massive followings exploiting the crisis
To address this, CCDH held a convening with government, regulators, frontline responders and targeted groups to draft a plan of action to strengthen the OSA. The resulting recommendations include:
- Legislating for mandatory data access for researchers
- Reintroducing assessment and reporting requirements for misinformation
- Addressing the commercial incentives behind false and incendiary information through digital advertising reform
The UK Online Safety Act as a digital safety leader
Happy Birthday, Online Safety Act. You are a momentous step towards a safer digital world. Your success will make you a shining example of how to reform and regulate social media companies. But the work is not done.
CCDH will continue to press for further rules on online safety, including advertising reform, addressing the spread of false information, and increasing transparency for independent researchers.
If you support CCDH and online safety, join our email community to receive updates on the Online Safety Act and our latest research.