2024 UK General Election: 4 ways the next government can make the internet safer

Posted on June 14, 2024 in Explainers.

The United Kingdom will elect its next government on July 4th. The last Parliament succeeded in passing the Online Safety Act and establishing a world-leading regulatory regime, but failed to get rights for bereaved parents to access their child’s data, and rights for independent researchers to access platform data, over the line. Without closing these loopholes, the ambitions of the Online Safety Act will be impossible to realise.

So as the UK heads to the polls, CCDH lays out the challenge for the next government: build on the success of the Online Safety Act, increase data access to close loopholes that undermine the regime, address the economic underpinnings of online hate by reforming digital advertising, and go further to safeguard the digital ecosystem of the future. 

Ensuring the success of the Online Safety Act

CCDH supported the Online Safety Bill from its inception. Our Chief Executive Imran Ahmed was the first witness before the Public Bill Committee and was one of the external stakeholders quoted by the Government in its press release announcing passage of the legislation.  

Now that the Act is law, we are working to support the regulator, Ofcom, in getting the regulations up and running. Consultations on Ofcom’s regulatory proposals will continue into 2025, but civil society groups and the victims of online harms are already pressing Ofcom to be bold in its ambition for online safety and to ensure that the systems and processes prioritised by the Act deliver the wider, societal harm reductions the Act was intended to address.

While the Online Safety Act is enforced by Ofcom as the independent communications regulator, the Secretary of State for Science, Innovation and Technology will have important responsibilities for secondary legislation and as a statutory consultee to the regulatory regime. The next Secretary of State must be clear that digital reform is a priority for their government, commit to legislation that plugs the gaps in the Online Safety Act, and affirm that the harms wrought by unaccountable technology companies will not be allowed to continue.

Increase transparency by granting data access to researchers  

During passage of the Online Safety Bill, CCDH led a campaign to increase data access rights for independent researchers. This campaign led to government amendments requiring Ofcom to report on researchers’ access to data and publish guidance to facilitate data access.  

While this report by the regulator will be a vital tool for building a robust data access process, it will not arrive until 2025. In the meantime, social media companies have been moving swiftly to shut down existing tools for researchers: Meta announced its intention to shut down the CrowdTangle insights tool in August 2024; Twitter ceased providing free research access to its Application Programming Interface and erected prohibitive cost barriers of $42,000 per month; and TikTok appears to have removed viewership data from harmful hashtags in its Creative Center.

Worse still, researchers who criticise platforms based on their data findings face increased legal intimidation and attacks. CCDH was targeted by Elon Musk’s X with a baseless lawsuit in a brazen attempt to intimidate and censor CCDH for its reporting on hate speech and misinformation on the platform. While the lawsuit was dismissed, its chilling effect on independent researchers will persist without proactive intervention from policymakers.

For these reasons, the next UK Government must legislate a statutory data access pathway for independent, vetted researchers and create legal protections for those who undertake public interest research.

Photo by Scott Graham on Unsplash

The next reformation: digital advertising

CCDH has researched the economic underpinnings of online hate and shown how, from climate denial to identity-based hate speech, profit motives and opaque systems have created a lucrative market for hate actors.  

In March, we showed how YouTube makes millions in ad revenue by hosting, and serving ads against, climate denial content. In our report Hate Pays, we documented how social media accounts posting amid the Israel-Gaza conflict leveraged hate to increase engagement, follower numbers, and profitability.

Across these and many other topics, the opacity around which types of content advertiser dollars support, and the way social media accounts leverage hate to reap the financial rewards of that opacity, underpins the flourishing ecosystem of online hate. Addressing the financial incentives that propel hate actors must be a priority for policymakers seeking to reform the digital world and ensure safety.

Safeguard the digital ecosystem of the future 

Since the launch of generative artificial intelligence platforms like ChatGPT, questions about the future governance and societal impact of emerging technologies have been at the top of the policy agenda. The Online Safety Act applies to AI-generated content only narrowly, when that content is uploaded to the social media platforms and search services regulated by the regime.

This after-the-fact application does not provide the proactive safeguards needed to avoid repeating the mistakes made during the rise of the social media giants. A lack of regulation allowed those technology companies to grow to enormous size without safeguards on their products.  

This delivered them enormous profits but wrecked our information ecosystem at tremendous societal cost. The next generation of technology companies cannot be allowed to do the same.   
 
The next UK government needs to institute common-sense safeguards on AI companies, ensuring their products are tested for safety prior to public launch. CCDH research has already shown the lack of pre-launch safety testing on artificial intelligence tools such as image generators and voice cloning, while others have raised the risk of AI chatbots being used by terrorists to radicalise and recruit adherents.

Our pledge 

CCDH has produced an indispensable guiding framework for the construction of future regulations – the CCDH STAR Framework. STAR stands for Safety by Design, Transparency, Accountability, and Responsibility – the fundamental components of technology regulation.  

Whichever party or parties form the next UK government, CCDH will continue to advocate for a reformed digital ecosystem that prioritises truth, safety, and human rights around the world.  

To stay up to date with developments in social media regulation, sign up to our mailing list to receive the latest on CCDH’s campaigns for social media reform to make the internet a safer place.
