
Mozilla suggests improvements to Canada’s online harms agenda

Later this year the Canadian government will publish new laws to overhaul how platforms in the country must tackle illegal and harmful content. The government’s desire to intervene is unsurprising – around the world, policymakers and the public are pressing for greater responsibility and accountability on the part of Big Tech. Yet in proposing that platforms take down more content in ever-shorter periods of time, the government’s approach responds merely to the symptoms and not the structural causes of online harms. Worse still, the proposal includes some policy ideas that would have the opposite effect, making online spaces less healthy and less inclusive. As we seek to advance a better vision for platform accountability across the world, we’re weighing in here with some recommendations on how Canadian lawmakers can use this moment to meaningfully enhance responsibility while protecting rights and competition online.

As detailed in a white paper released over the summer, the government wants content-sharing platforms to monitor their services for certain forms of objectionable content and act on user reports within 24 hours. The new rules will apply to some categories of content that are already illegal under the Criminal Code (like child abuse material) as well as forms of content that, though not captured under the Criminal Code, are nonetheless considered harmful when transmitted through online services (e.g. nonconsensual nude imagery). A new regulator will police the rules, and companies will be subject to strict retention and reporting requirements.

We understand the desire to act against online harms, but the government’s suggested approach misses the mark. It focuses merely on the symptoms of harmful online experiences, not the structural factors that make those experiences harmful. The government’s approach is underpinned by the implicit belief that it is possible to sanitise the web of objectionable content – companies just need to take down more content, more quickly. Yet the reality is quite different. We know that harmful content experiences often stem from how objectionable content is amplified, targeted, and presented to individuals online (e.g. through content recommender systems and ad microtargeting techniques).

The government’s apparent ‘zero-tolerance’ approach to objectionable content likewise manifests through the proposal that online services must report instances of ‘potentially criminal content’ to national security agencies. This attempt to responsibilise online services is deeply concerning. It will incentivise greater and more invasive monitoring of individuals by platforms (e.g. upload filtering; real-name policies) and have a disparate impact on those individuals and communities who already face structural oppression in the criminal justice system.

On the basis of the above, we believe that policymakers should pursue a systems-level approach to addressing content-related harms. In our vision, policy serves to incentivise greater responsibility from companies in how they design and operate their services, and to help ensure that companies’ business practices do not inadvertently engender or exacerbate content-related harms. As we’ve engaged in these conversations around the world, we’ve built out a vision for what a systems-level approach to addressing online harms and improving platform accountability could look like. As a starting point, we have four recommendations for how Canadian lawmakers should design their upcoming policy intervention:

  • Asymmetry of obligations: The government should avoid one-size-fits-all approaches that put regressive and unnecessary compliance burdens on small and low-risk companies. Many of the most pressing policy issues pertain to a small subset of companies, distinguished by their scale and business practices, and the rules should reflect that (e.g. stricter rules for companies of a certain size, scale, or target market).
  • A risk-based approach: Companies should be obligated or incentivised to undertake risk assessments that identify the ways in which their service’s design, operation, and misuse could engender or compound online harms. The rules should likewise oblige or incentivise companies to take steps to reduce the probability and severity of these risks, in a continuous cycle of procedural accountability.
  • Systemic transparency: Some of the most egregious harms in the online ecosystem remain hidden from view, simply because we do not have insight into how platforms shape online experiences. Transparency is a crucial prerequisite for accountability, and so Canadian lawmakers should implement a robust regime that allows regulators and public interest researchers to look under the hood of platforms (e.g. mandating that platforms disclose meaningful data concerning the ads that run on their services).
  • Polycentric oversight: Standing up a dedicated oversight body for content-sharing platforms makes sense in principle, but the devil is in the detail. It’s essential that oversight bodies be well-resourced, staffed with the appropriate technical expertise, and designed so as not to undercut judicial processes or safeguards. We also think it’s important that oversight be polycentric (e.g. by incentivising third-party auditing and establishing data access regimes for researchers) to avoid a single point of failure in the oversight function.

We believe an approach built around these features is more likely to achieve the government’s objectives and to ensure all Canadians can enjoy an internet experience defined by civil discourse, human dignity, and individual expression. The government is expected to continue consulting on these plans throughout the fall, and we encourage the policy community in Canada and elsewhere to feed into the discussions and help guide lawmakers towards progressive, thoughtful policy paths. For Mozilla, this workstream raises issues that are core to our mission, and we’ll continue to follow the discussions closely.