Digital contact tracing apps have emerged in recent weeks as one potential tool in a suite of solutions that would allow countries around the world to respond to the COVID-19 pandemic and get people back to their daily lives. These apps raise a number of challenging privacy issues and have been subject to extensive technical analysis and argument. One important question that policymakers are grappling with is whether they should pursue more centralized designs that share contact information with a central authority, or decentralized ones that leave contact information on people’s devices and out of the reach of governments and companies.
Firefox Chief Technology Officer Eric Rescorla has an excellent overview of these competing design approaches, with their different potential risks and benefits. One critical insight he provides is that there is no Silicon Valley wizardry that will easily solve our problems. These different designs present us with different trade-offs and policy choices.
In this post, we want to provide a direct answer to one policy choice: Our view is that centralized designs pose serious risks and should be disfavored. While decentralized systems present concerns of their own, their privacy properties are generally superior in situations where governments have chosen to deploy contact tracing apps.
Should your government have the social graph?
Centralized designs share data directly with public health professionals, which may aid their manual contact tracing efforts by giving them a tool to identify and reach out to other potentially infected people. That is a key benefit identified by the designers of the BlueTrace system in use in Singapore. The biggest problem with this approach, as described recently by a number of leading technologists, is that it would expand government access to the “social graph” — data about you, your relationships, and your links with others.
The scope of this risk will depend on the details of specific proposals. Does the data include your location? Is it linked to phone numbers or emails? Is app usage voluntary or compulsory? A number of proposals share your contact list only when you are infected, so if the infection rate is low, access to the social graph will be more limited. But regardless of the particulars, we know this social graph data is nearly impossible to truly anonymize. It will provide information about you that is highly sensitive and can easily be abused for a host of unintended purposes.
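To make the difference concrete, here is a deliberately simplified sketch, in Python, of what a central server ends up storing under each design. The class and function names and the data structures are hypothetical illustrations for this post, not a description of BlueTrace or any other deployed system.

```python
# Illustrative only: a toy comparison of what a central server learns under a
# centralized versus a decentralized contact tracing design (Python 3.9+).
import hashlib
import os


def rotating_broadcast_id() -> str:
    """A random identifier a phone might broadcast over Bluetooth and rotate often."""
    return hashlib.sha256(os.urandom(32)).hexdigest()[:16]


class CentralizedServer:
    """Centralized design: on a positive test, the app uploads the identifiers of
    everyone the patient encountered, so the server can reconstruct relationships."""

    def __init__(self) -> None:
        self.social_graph: dict[str, list[str]] = {}

    def report_infection(self, patient: str, contacts: list[str]) -> None:
        # The server now knows who met whom: a piece of the social graph.
        self.social_graph[patient] = contacts


class DecentralizedServer:
    """Decentralized design: on a positive test, the app uploads only the patient's
    own rotating identifiers. The server never learns who met whom."""

    def __init__(self) -> None:
        self.infected_ids: set[str] = set()

    def report_infection(self, own_broadcast_ids: list[str]) -> None:
        self.infected_ids.update(own_broadcast_ids)


def exposure_check_on_device(ids_seen_nearby: list[str],
                             published_infected_ids: set[str]) -> bool:
    """In the decentralized design, each phone downloads the published identifiers
    and does the matching locally; the result never has to leave the device."""
    return any(i in published_infected_ids for i in ids_seen_nearby)
```

In the centralized case the server accumulates relationship data by design; in the decentralized case the only thing that leaves a phone is the patient’s own rotating identifiers, and the matching happens on each device.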
Social graph data could be used to see the contacts of political dissidents, for criminal investigations, or for immigration enforcement, to give just a few examples. This isn’t just about risk to personal privacy. Governments, in partnership with the private sector, could use this data to target or discriminate against particular segments of society.
Recently, many have pointed to well-established privacy principles as important tools that can mitigate the privacy risks created by contact tracing apps. These include data minimization, rules governing data access and use, strict retention limits, and sunsetting of technical solutions when they are no longer needed. These are principles that Mozilla has long advocated for, and they may have important applications to contact tracing systems.
These protections are not strong enough, however, to prevent the potential abuse of data in centralized systems. Even minimized data is inherently sensitive because the government needs to know who tested positive and who their contacts are. Recent history has shown that this kind of data, once collected, creates a tempting target for new uses — and for attackers if not kept securely. Neither governments nor the private sector has shown itself up to the task of policing these new uses. The incentives to put data to unintended uses are simply too strong, so privacy principles don’t provide enough protection.
Moreover, as Mozilla Executive Director Mark Surman observes, the norms we establish today will live far beyond any particular app. This is an opportunity to establish the precedent that privacy is not optional. Centralized contact tracing apps threaten to do the opposite, normalizing systems that track citizens at scale. The technology we build today will likely live on. But even if it doesn’t, the decisions we make today will have repercussions beyond our current crisis, long after we’ve sunset any particular app.
At Mozilla, we know about the pitfalls of expansive data collection. We are not experts in public health. In this moment of crisis, we need to take our cue from public health professionals about the problems they need to solve. But we also want policymakers, and the developers building these tools, to be mindful of the full costs of the solutions before them.
Trust is essential to helping people take the steps needed to combat the pandemic. Centralized designs that provide contact information to central authorities are more likely to create privacy and security issues over time, and more likely to erode that trust. On balance, we believe decentralized contact tracing apps, designed with privacy in mind, offer a better tool to solve real public health problems and to establish a trusted relationship with the technology our lives may depend on.