
Fighting Crime Shouldn’t Kill the Internet

The internet has long been a vehicle for creators and commerce. Yesterday, the Senate introduced a bill that would significantly limit the protections that have allowed vibrant online communities and content platforms to flourish, and that let users create and consume uncurated material. While well intentioned, the liability the bill places on intermediaries would chill online speech and commerce. That makes it a counterproductive way to address sex trafficking, the bill’s ostensible purpose.

The internet, from its inception, has been a place that fosters platforms and creators. In 1996, Congress passed a law intended to limit illegal content online – the Communications Decency Act (CDA). Section 230 of the CDA, however, provided protections for intermediaries: if you don’t know about particular illegal content, you aren’t held responsible for it. Intermediaries include platforms, websites, ISPs, and hosting providers, which as a result of CDA 230 are not held responsible for the actions of their users. Section 230 is one of the reasons that YouTube, Facebook, Medium, and online commenting systems can function without the technical burden or legal risk of screening every piece of user-generated content. Online platforms – love ‘em or hate ‘em – have enabled millions of less technical creators to share their work and opinions.

A fundamental part of the CDA is that it only punishes “knowing conduct” by intermediaries. That protection is missing from the changes this new bill proposes to CDA 230. The authors of the bill appear to be trying to preserve this core balance – but they don’t add the “knowing conduct” language back into CDA 230 itself. Because they put it in the federal sex trafficking criminal statute instead, only federal criminal cases would need to show that a site knew about the problematic content. The bill would open gaps in CDA 230’s liability protections that are not so easily closed. State laws can target intermediary behavior too, and without a “knowing conduct” standard in the CDA directly, platforms of all types could be held liable for conduct of others that they know nothing about. The same is true of the new federal civil right of action the bill introduces. That means a small drafting choice strikes at the heart of the safe harbor provisions that make CDA 230 a powerful driver of the internet.

This bill is not well scoped to solve the problem, and it does not target the actual perpetrators of sex trafficking. Counterintuitively, it disincentivizes content moderation by removing the safe harbor around the moderation (including automated moderation) that companies build, in part to detect illegal content like trafficking. And why would a company want to help law enforcement find criminal content on its service when someone is going to turn around and sue it for having hosted that content in the first place? Small companies and startups that rely on the safe harbor to innovate would face greater legal risk for any user activity they facilitate. And users would have a much harder time finding places to do business, create, and speak.

The bill claims that the CDA was never intended to protect websites that promote trafficking – but the law was carefully tailored to ensure that intermediaries are not responsible for the conduct of their users. It has to work this way for the internet we know and love to exist. That doesn’t mean law enforcement can’t do its job – the CDA was built to provide ways to go after the bad guys (and to incentivize intermediaries to help). The proposed bill doesn’t do that.