
Mozilla Submits Comments to NTIA on Privacy, Equity, and Civil Rights

This month, the National Telecommunications and Information Administration (NTIA) requested comments on a vital topic: the intersection of privacy, equity, and civil rights in commercial data collection practices. As the principal advisor to the President on telecommunications and information policy, NTIA is well placed to tackle these questions, and we applaud its focus on privacy – particularly for marginalized and underserved communities, who are especially at risk of privacy violations.

As builders of privacy- and security-preserving technology, Mozilla shared its perspective, advocating for improved consumer technology and policies that advance choice and equitable outcomes for consumers.

At Mozilla, privacy comes first in our products, through features like Enhanced Tracking Protection (ETP) and our end-to-end encrypted Firefox Sync service. Mozilla also prioritizes privacy in our public interest advocacy, calling for comprehensive privacy legislation, greater ad transparency, and robust enforcement of data privacy laws and regulations around the globe. Through research such as the award-winning Internet Health Report and “Privacy Not Included” consumer guide, Mozilla serves as a global, open resource to empower consumers, inform policymakers, and inspire industry best practices.

Our comments to NTIA focus on areas where data collection practices, ad targeting, and the use of automated decision-making systems (ADMS) pose substantial risks to equity and civil rights. We conclude with several policy recommendations to guide more equitable technology policymaking.

Privacy and Data Collection Practices Online:

  • Privacy practices must strike a balance between delivering value in a service and minimizing consumer data collection. In practice, this means collecting only the data that is needed, keeping it no longer than necessary, and clearly and concisely explaining to consumers what data we collect, how we use it, and how we mitigate risks. From an equity perspective, it is important to note that without comprehensive federal privacy protections, many Americans are unprotected. Depending on where somebody lives, they may not have the right to access, delete, or even correct data – for example, potentially false criminal records that data brokers share about them.

Equity, Ad targeting, and ADMS:

  • In many ways, ad targeting systems drive today’s internet. These systems allow advertisers to target consumers based on their interests, behaviors, and demographics. They also make it easy to segment racial or demographic groups, creating the potential for discriminatory outcomes and raising significant equity concerns. Automated decision-making systems (ADMS) share a similar set of equity risks, but across a broader range of use cases, from facial recognition to automated decisions about mortgage rates or healthcare.
  • Although these practices and automated systems are difficult to study from the outside, the companies that develop and implement them are in a position to prevent or mitigate harms like discrimination. Yet the incentives for broad adoption of audits and mitigation tools are lacking. Rules are therefore necessary to change this incentive structure and require companies to price in the externalities caused by the ADMS they develop.

Policy Recommendations:

  • Transparency is an important prerequisite to empowering consumers and diagnosing the potential harms and risks associated with certain data collection practices. As part of this, we also need federal mandates for greater researcher access to platform data, along with protections for those researchers. To address the harms associated with ADMS, policymakers must also compel transparency so that we can understand: Are ADMS used for certain decisions? How do they make those decisions? In what contexts are they deployed?
  • Regulators must also take proactive steps to require audits of ADMS that can surface and prevent potential harms to consumers, both before and after deployment in the market. This means ensuring that systems are checked for bias, accuracy, privacy risks, and other harms.
  • Comprehensive privacy legislation at the federal level and broader adoption of Global Privacy Control (GPC) are other important protective measures.
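For context on how GPC works in practice: the specification has user agents signal an opt-out preference on every request via the `Sec-GPC: 1` HTTP header (and expose it to scripts as `navigator.globalPrivacyControl`). A minimal sketch of a server-side check for that signal might look like the following; this is an illustration of the mechanism, not any particular site's implementation.

```python
def request_has_gpc(headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control opt-out.

    Per the GPC specification, user agents that enable GPC send the
    request header "Sec-GPC: 1" on every request.
    """
    # HTTP header names are case-insensitive, so normalize before lookup.
    normalized = {name.lower(): value.strip() for name, value in headers.items()}
    return normalized.get("sec-gpc") == "1"


# Example: a site receiving this request should treat the user as opted out
# of the sale or sharing of their data (where GPC has legal effect).
print(request_has_gpc({"Sec-GPC": "1"}))   # → True
print(request_has_gpc({"Accept": "*/*"}))  # → False
```

Because the signal is set once in the browser and sent automatically everywhere, it spares users from exercising opt-out rights site by site, which is why broader adoption matters from an equity standpoint.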

To read Mozilla’s full submission, click here.