How the UK’s Online Safety Bill would help tackle racist hate online
Last week we set out why we needed the UK’s Online Safety Bill and how it would work. But what would the Bill mean day-to-day for users who face online harm?
We’re going to start unpacking what the Bill could achieve by looking at our research over the last several years and seeing what harms we identified that an Online Safety Bill might have prevented.
In this blog, we focus on how the Bill would work to tackle racist hate, in particular looking at research we’ve carried out on antisemitism, anti-Muslim hate and racist abuse directed at the England footballers Marcus Rashford, Bukayo Saka and Jadon Sancho following the Euro 2020 final.
Platforms' empty promises on racist hate
All social media platforms claim to act on racist hate. It is explicitly banned in every large platform's 'Community Standards'. But our research has shown that even when extremist hate is reported to them using their own tools, they frequently fail to act.
Platforms' failings on hateful content received enormous attention around the world after a wave of racist abuse was directed at the England national team footballers Marcus Rashford, Bukayo Saka and Jadon Sancho following the Euro 2020 final. Our analysis of racist abuse directed at players on Instagram in the aftermath of the match showed that 94 per cent of accounts reported for racially abusing players had not been removed after 48 hours. A second audit conducted six weeks later showed that 75 per cent of the accounts had still not been removed.
So the case couldn't be clearer: platforms are failing on racist hate, even when it is brought to their attention. That is why we need legislation like the Online Safety Bill to compel them to act.
How would the Online Safety Bill help?
The Online Safety Bill requires platforms to have systems in place to prevent the widespread publishing of some illegal racist content, and to remove it when it is reported to them.
The Government has also announced that platforms will have to publish standards on “online abuse and harassment” and be transparent with users about what action they take on harmful racist content that breaches their standards.
The Bill ensures that platforms are accountable to a regulator, that there is a proactive ‘duty of care’ to stop illegal content and protect children from racist content, and that there is transparency in tackling other harmful racist content.
The regulator, Ofcom, will be able to:
- Set guidance in its Codes of Practice, using its powers to review and audit big tech and to consider super-complaints where applicable.
- Penalise companies that fail to comply.
- Impose fines of up to 10% of global annual revenue – a staggering amount for platforms such as Facebook.
What effect would this have had?
Looking at the online harms we have identified and studied in our research, this means that, on a practical level, platforms would have had a duty to stop the spread of illegal racist content that targets Jewish and Muslim social media users, as well as black football players like Marcus, Bukayo and Jadon. The Bill would provide greater transparency for users on whether platforms are actually keeping their promises on racist hate that breaches their standards, making it easier for us to hold them accountable.
This should mean that people who receive abuse will have a better complaints system with clearer policies that are enforced more consistently.
Social media companies are putting profit before people, maximising the money they make from users like us without doing the bare minimum to keep us safe. That affects all of us, contributing to problems from racist abuse to dangerous health misinformation to self-harm and eating disorder content that can ruin young people's lives.
The UK’s Online Safety Bill has the potential to change the balance.
Check back again soon and we will be explaining how the Online Safety Bill would have an impact on the hate and misinformation we find every day.
In the meantime, you can sign up to receive updates about our campaigns.