Biden Administration’s Principles could help make the internet safer for all

Posted on September 09, 2022 in News.

[Image: Two small children playing with a virtual reality set]

We welcome the White House’s new Principles for Enhancing Competition and Tech Platform Accountability. Many of these Principles are reflected in our STAR Framework for assessing global legislative reform efforts, which we have brought to the attention of the US Government and legislators, alongside extensive research showing the harm caused by the unregulated, “Wild West” environment.

[Image text: Safety by Design, Transparency, Accountability, Responsibility]

For many years, countries rallied around the notion that the Internet was neutral and that platforms needed protection from third-party liability in order to foster innovation and growth in a new industry. This produced a few decades of regulatory ambivalence: the international community adopted a “hands-off” approach, or at best an individual, content-based approach to regulating online harm in some jurisdictions, with technology companies treated as neutral actors. In this permissive regulatory environment, without checks and balances, tech companies were encouraged to adopt an aggressive, profit-driven business strategy that followed the “move fast and break things” maxim.

Unfortunately, the things that were being broken were our kids, our democracy, public health, and social cohesion. While Big Tech was making record profits, the costs and externalities were being borne by members of the public, who were subject to a toxic information ecosystem and largely unfettered harassment and abuse without redress – particularly women and minorities.

Through our work at CCDH, we have developed a deep understanding of the online harm landscape. Since 2016, we have researched the rise of online hate and disinformation, showing that malicious actors can easily exploit the digital platforms and search engines that promote and profit from their content. CCDH has studied the way anti-vaccine extremists, hate actors, climate change deniers, and misogynists weaponize platforms to spread lies and attack marginalized groups. Through this work, we have seen the depth and breadth of harm that tech companies are profiting from on a daily basis.

“We have seen the depth and breadth of harm that tech companies are profiting from on a daily basis.”

Platforms are failing us

What has remained consistent, across all types of harmful content, is an absence of proper transparency and a failure of platforms and search engines to act. Our research and advocacy work shows repeated failures by social media companies to act on harmful content or the individuals/networks who are sharing it. We have shown how the companies’ algorithms – with a systematic bias towards hate and misinformation – have had a damaging impact on our information ecosystem.

The failure of social media companies to act on known harmful content connected with terrorism, racism, misogyny, and online hate violates their own terms and conditions. It also betrays the pledges made to the international community when the cameras were rolling, and the inherent dignity that the victims of tragedies like Buffalo, Christchurch, and Myanmar were entitled to: the right to live safely in their communities and to be safe from extremist, racist terrorism. This failure to act is the reality of the self-regulation environment. Self-regulation means no regulation.

“Self-regulation means no regulation.”

After our first Global Summit to address Online Harms and Misinformation, held in Washington, D.C. in May 2022, where we discussed the actions being taken in each country to counter hate and lies online, we saw the need to develop the STAR Framework to support global efforts to regulate social media and search engine companies.

The STAR Framework sets out four key standards for social media reform, to ensure effectiveness, connectedness, and consistency for a sector whose reach affects people globally.

STAR Framework and the Principles

Safety by Design. Safety by design means that technology companies need to be proactive at the front end to ensure that their products and services are safe for the public (including children). Rather than waiting for problems to happen and then sending in an ambulance to deal with multiple individual issues, safety by design principles mean adopting a preventative systems approach to harm.

Transparency. Transparency is desperately needed in three key areas:

  • Algorithms
  • Rules enforcement
  • Economics (particularly related to advertising)

Accountability to democratic and independent bodies. Regulation is most effective where accountability systems are in place for statutory duties and for harm caused, particularly where there is a risk of inaction because of profit or commercial factors. Accountability systems could include enforcement mechanisms and an independent pathway for challenging decisions or omissions.

Responsibility for companies and their senior executives. Responsibility means consequences for actions and omissions that lead to harm. A dual approach – holding responsible both companies and their senior executives – is a common intervention strategy for changing corporate behavior.

The Biden Administration’s Principles also call for safeguards for children and privacy, transparency around algorithms and rules enforcement, and holding Big Tech accountable by removing special protections for large platforms under Section 230.

[Image: Girl holding a phone]

The impact is real – on people, communities, and democracy. We cannot continue on the current trajectory, with harmful individuals creating a toxic disinformation ecosystem and Big Tech’s broken business model driving offline harm. We need to reset our relationship with technology companies and collectively legislate to address the systems that amplify hate and dangerous misinformation worldwide. The STAR Framework draws on the most important elements for achieving this: Safety by Design, Transparency, Accountability, and Responsibility.

We urge Congress to step up its efforts to legislate in line with the STAR Framework and the Biden Administration’s Principles.