CCDH’s CEO Imran Ahmed at the 2023 Eradicate Hate Global Summit
On September 29, our CEO Imran Ahmed discussed social media regulation at the 2023 Eradicate Hate Global Summit, the world’s leading gathering of anti-hate experts. In a panel with Dr. Murtaza Shaikh (Ofcom) and Sasha Havlicek (Institute for Strategic Dialogue), Ahmed analyzed the European Union Digital Services Act and the UK’s Online Safety Bill, and highlighted the need for the US to approve similar legislation.
“In the US, [social media companies] have an absolute get-out-of-jail-free card, and we hope for transparency, which allows for meaningful accountability and then the sharing of responsibility, so it’s not just us who bear the entire cost of the failure of these platforms to enforce their rules. Through that meaningful response, through the sharing of responsibility, we will get a culture of safety by design,” said Ahmed.
Read the transcript of Ahmed’s speech:
My name is Imran Ahmed. I’m the chief executive of the Center for Countering Digital Hate. We’re a US 501(c)(3) that studies and, more importantly, disrupts the production and distribution of the online hate and lies that we know underpin so many tragedies around the world, including the one that started my journey into this work seven years ago in the United Kingdom, during the EU referendum, when my colleague Jo Cox MP was murdered by a far-right terrorist who believed in a version of the great replacement theory.
And that’s the connective tissue, isn’t it, Laura? Between you, me, and the grief that started us on both of our journeys: you to set up Eradicate Hate, me to set up the Center for Countering Digital Hate some years earlier. And of course, it did the same with Christchurch. The same conspiracy theory was created, promulgated, weaponized, operationalized, and turned into real-world violence, starting on those platforms.
I spent three years after 2016 also seeing the rise of antisemitism on the left in the Labour Party, a party for which I worked at the time as a special adviser to the Shadow Foreign Secretary. I worked with the platforms, and I see some of those platforms here today, platforms I worked with seven years ago. At the time, however, my experience was that while they welcomed the information I was giving them about the problems being fomented on their platforms, they did not take the action I felt was required to deal with them.
And that’s why we’ve gotten to the point we are at today, where we need legislation, a regulatory backstop, to ensure that action is taken. CCDH’s perspective developed over seven years of working with the British government and the EU, as well as with the US government; you know, I live in Washington, DC, a few blocks from the Capitol.
I was the opening witness at the Online Safety Bill Committee in the UK two years ago, on September 14th, 2021. And what we said then is what I’ll say to you now we need in the US: a regulatory guarantee of four things, none of which impedes freedom of speech, none of which impedes their ability to make lots and lots of money. And people have the right to make lots and lots of money. Mark Zuckerberg made lots of money. Elon Musk is probably losing money, but he may make money eventually.
It starts with transparency. You can’t have a debate, you can’t have a dialogue, unless there’s honesty: honesty on how they enforce their rules, the content enforcement, and honesty on the algorithms. I’m going to do something very dangerous right now. I’m going to say Laura was slightly wrong when she talked about the algorithm. Algorithms don’t give you more of what you already like. They give you more of what they want you to like. It’s really different. We did a study called Malgorithm a couple of years ago in which we asked, during the pandemic: if people are following anti-vaxxers, what do they get?
They get served not more anti-vax content; they get QAnon content, they get antisemitic content, because they know that conspiracism is driven by epistemic anxiety, and if you like one conspiracy theory, you’re more likely to like another. So they start pulling you into different rabbit holes, because they want you buried at the bottom of a conspiracist warren out of which you won’t emerge, so you keep consuming content on their platform.
That’s the nature of the algorithm: it isn’t a human being; it’s an unfeeling, unemotional and, more importantly, completely opaque machine. So we want transparency of these algorithms. We want transparency of the advertising. Here’s the other dirty secret about these companies: they’re not in the free speech business. They’re in the advertising business. 98% of Meta’s revenue comes from advertising.
A big problem, by the way, with the UK’s Online Safety Bill and with the EU’s DSA is that they fail to address that. These aren’t free speech companies; they’re advertising companies. How does advertising distort the lens that they provide for us on the world’s speech? Because of course it does distort that lens. There’s a commercial imperative underpinning it, and we have the right to understand: if they are free speech platforms, platforms on which we can see what the world is saying, are you distorting it to favor your advertisers?
So transparency is vital. Without transparency, we can’t have meaningful accountability. And that’s where we are in the US right now. Of course, we’ve all seen the congressional hearings where Mark Zuckerberg goes in front of Congress, is asked questions, and answers questions, some of which are profoundly ill-informed. But that’s partly because there’s no transparency.
So, accountability to democratic bodies. I’m so glad that Murtaza (Ofcom) is here; I’m going to see his boss next week in London to talk about how they’ll be using their new powers. And Murtaza is one of the first generation of people who will have a real, genuine, democratically empowered ability to hold these big companies to account.
And when they’re found to have failed, it’s important that they take responsibility. Now, Laura talked about something called Section 230 of the Communications Decency Act. That is a really weird law. Every company in this country is subject to negligence law apart from social media companies; they’re essentially exempt from liability for negligence in how their platforms are built, in their product design. And that means they can’t be sued if they cause harm. And they know that. In a situation where they know that bad actors are congregating on their platforms, using their tools to operationalize, you tell them about it, they do nothing about it, a terrorist attack occurs, people are killed, and there’s nothing you can do about it through the civil courts. Nothing.
That’s an extraordinary protection that was given to the early Internet. The Communications Decency Act is from 1996. It was designed for bulletin boards and the under-the-article comment sections of news websites, not for social media companies, because they didn’t exist at the time. So, responsibility: the EU and UK now have the powers, in those situations I just described to you, to impose fines of up to 10% of revenues, billions of dollars.
And of course, in the US they have an absolute get-out-of-jail-free card, and we hope for transparency, which allows for meaningful accountability and then the sharing of responsibility, so it’s not just us who bear the entire cost of the failure of these platforms to enforce their rules. Through that meaningful response, through the sharing of responsibility, we will get a culture of safety by design.
I want to speak briefly about what I saw. Many of you may have seen what Meta and TikTok presented earlier this week, and I loved what they produced with Inge. You know, I’m a weeper; I cry at Paddington, the movie. But the first time I cried at this conference was watching that. It was beautiful, what they did. But it’s not enough.
Meta in that time has shed half of its Trust and Safety staff, the people who are responsible for making sure that things like Pittsburgh don’t happen again. It’s really important that we never forget, and I’m so glad that they helped us to make sure that we never forget the Holocaust. But we must never forget that these vast businesses have been responsible for genocides elsewhere in the world, like Myanmar.
But they are also in part responsible for creating the enabling environment in which people committed the atrocities that bring us all here today. So I’m very glad that we’ve now legislated in the UK and EU, and I know there are US government people in the room. Over to you now.