Addressing gender-based online harms in the DSA

Last year the European Commission published the Digital Services Act (DSA) proposal, a draft law that seeks to set a new standard for platform accountability. We welcomed the draft law when it was published, and since then we have been working to ensure it is strengthened and elaborated as it proceeds through the mark-up stage. Today we’re confirming our support for a new initiative that focuses on improving the DSA with respect to gender-based online harm, an objective that aligns with our policy vision and the Mozilla Manifesto addendum.

An overarching priority in our efforts to improve the DSA has been the draft law’s risk assessment and auditing provisions. In order to structurally improve the health of the internet ecosystem, we need laws that compel platforms to meaningfully assess and mitigate the systemic risks stemming from the design and operation of their services. While the draft DSA is a good start, it falls short when it comes to specifying the types of systemic risk that platforms need to address.

One such area of systemic risk that warrants urgent attention is gender-based online harm. Women and non-binary people are subject to massive and persistent abuse online: in 2020, 74% of women in the EU reported having experienced some form of online violence. Women from marginalised communities, including LGBTQ+ people, women of colour, and Black women in particular, are disproportionately targeted with online abuse.

In our own platform accountability research, this untenable reality has surfaced time and time again. For instance, in a testimony submitted to Mozilla Foundation as part of our YouTube Regrets campaign, one person wrote: “In coming out to myself and close friends as transgender, my biggest regret was turning to YouTube to hear the stories of other trans and queer people. Simply typing in the word ‘transgender’ brought up countless videos that were essentially describing my struggle as a mental illness and as something that shouldn’t exist. YouTube reminded me why I hid in the closet for so many years.”

Another story read: “I was watching a video game series on YouTube when all of a sudden I started getting all of these anti-women, incel and men’s rights recommended videos. I ended up removing that series from my watch history and going through and flagging those bad recommendations as ‘not interested’. It was gross and disturbing. That stuff is hate, and I really shouldn’t have to tell YouTube that it’s wrong to promote it.”

Indeed, further Mozilla research into this issue on YouTube has underscored the role of automated content recommender systems in exacerbating the problem, to the extent that they can recommend videos that violate the platform’s own policies, including its policy on hate speech.

This is not only a problem on YouTube, but on the web at large. And while the DSA is not a silver bullet for addressing gender-based online harm, it can be an important part of the solution. To underscore that belief, we at the Mozilla Foundation have today signed on to a joint Call with stakeholders from across the digital rights, democracy, and women’s rights communities. This Call aims to invigorate efforts to improve the DSA’s provisions on risk assessment and management, and to ensure lawmakers appreciate the scale of gender-based online harm that communities face today.

This initiative complements other DSA-focused engagements that seek to address gender-based online harms. In July, we signaled our support for the Who Writes the Rules campaign, and we stand in solidarity with the just-published testimonies of gender-based online abuse faced by the initiative’s instigators.

The DSA has been rightly billed as an accountability game-changer. Lawmakers owe it to those who suffer gender-based online harm to ensure these systemic risks are properly accounted for.

The full text of the Call can be read here.