What we think about the UK government’s ‘Online Harms’ white paper

The UK government has just outlined its plans for sweeping new laws aimed at tackling illegal and harmful content and activity online, described by the government as ‘the toughest internet laws in the world’. While the UK proposal has some promising ideas for what the next generation of content regulation should look like, there are several aspects that would have a worrying impact on individuals’ rights and the competitive ecosystem. Here we provide our preliminary assessment of the proposal, and offer some guidance on how it could be improved.

According to the UK white paper, companies of all sizes would be under a ‘duty of care’ to protect their users from a broad class of so-called ‘online harms’, and a new independent regulator would be established to police them. The proposal responds to legitimate public policy concerns around how platforms deal with illegal and harmful content online, as well as the general public demand for tech companies to ‘do more’. We understand that in many respects the current regulatory paradigm is not fit for purpose, and we support an exploration of what codified content ‘responsibility’ might look like.

The UK government’s proposed regulatory architecture (a duty of care overseen by an independent regulator) has some promising potential. It could shift the focus to regulating systems and instilling procedural accountability, rather than assessing individual pieces of content under a binary liability/no-liability framework. If implemented properly, its principles-based approach could allow for scalability, careful tailoring, and future-proofing, features that are presently lacking in the European content regulation paradigm (see, for instance, the EU Copyright Directive and the EU Terrorist Content Regulation).

Yet while the high-level architecture has promise, the UK government’s vision for how this new regulatory model could be realised in practice contains serious flaws. These must be addressed if this proposal is to reduce rather than encourage online harms.

  • Scope issues: The duty of care would apply to an extremely broad class of online companies. While this is not necessarily problematic if the approach is a principles-based one, there is a risk that smaller companies will be disproportionately burdened if the Codes of Practice are developed with only the tech incumbents in mind. In addition, the scope would cover both hosting services and messaging services, despite their radically different technical architectures.
  • A conflation of terms: The duty of care would apply not only to a range of content types – from illegal content such as child abuse material to legal but harmful material such as disinformation – but also to harmful activities – from cyberbullying to immigration crime to ‘intimidation’. This conflation of content with activities, and of illegal with legal-but-harmful, is concerning, given that many content-related ‘activities’ are almost impossible to identify proactively, and there is rarely a shared understanding of what ‘harmful’ means in different contexts.
  • The role of the independent regulator: Given that this regulator would have unprecedented power to determine how online content control works, it is worrying that the proposal doesn’t spell out the safeguards that would be put in place to ensure its Codes of Practice are rights-protective and workable for different types of companies. Nor does it give any clarity as to how the development of the codes would be made truly co-regulatory.

Yet, as we noted earlier, the UK government’s approach holds some promise, and many of the above issues could be addressed if the government is willing. There are some crucial changes that we will be encouraging the UK government to adopt when it brings forward the relevant legislation. These relate to:

  • The legal status: A whole corpus of jurisprudence around duties of care and negligence law has developed over centuries; it is therefore essential that the UK government clarifies how this proposal would interact with and relate to existing duties of care.
  • The definitions: There needs to be much more clarity on what is meant by ‘harmful’ content. Similarly, there must be much more clarity on what is meant by ‘activities’ and the duty of care must acknowledge that each of these categories of ‘online harms’ requires a different approach.
  • The independent regulator: The governance structure must be truly co-regulatory, to ensure the measures are workable for companies and protective of individuals’ rights.

We look forward to engaging with the UK government as it enters the consultation period on the new white paper. Our approach to the UK government will mirror the one we are taking vis-à-vis the EU – that is, building out a vision for a new content regulation paradigm that addresses lawmakers’ legitimate concerns, but in a way that protects individuals’ rights and the competitive ecosystem. Stay tuned for our consultation response in late June.