Enabling Trust: The Difference Between Being Trusted and Trustworthy

We are getting ready for Data Privacy Day, and you are likely to hear the word “trust” come up often. When you give your data to a company, you are trusting that company to act responsibly with it. And because data powers so many of the products and services we use, that trust is critical to the modern internet. But there is also a negative side to the way we talk about trust and privacy. For me, when I hear the word trust, I think someone is trying to sell me something. I hear, “Give me your data and don’t ask too many questions.”

This gets to the difference between being a trusted company and being a trustworthy company. Many of the companies you engage with online ask for your trust without earning it. You interact with them every day, but you can't really tell whether they've got your back behind the scenes. They don't give you meaningful choices over your privacy. They are trusted but not necessarily trustworthy. At Mozilla we strive to be both. At Mozilla, every day is Data Privacy Day.

Building trust with your users around their data doesn’t have to be complicated. But it does mean that you need to think about user privacy and security in every aspect of your product.

We do ask our users to give us their data. That data can help us improve Firefox and give us insight into the health of the internet in general. But we also encourage our users to ask questions, we give them tools to answer those questions, and we make it easy to turn off data collection if they don’t like the answers they find.

For example, a few months ago we launched our first Context Graph experiment, which collects data about how Firefox users browse the web. This can be some pretty sensitive stuff, which is why we asked our users to opt in to the collection. The code for that experiment is publicly available, along with the practices that govern the data’s use and the code for the analysis we conduct on the data. We’ve tried to put similar tools in place – clear notices, public code and public data documentation – for Test Pilot, our platform for testing and learning from experimental features in Firefox.
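To make the opt-in idea concrete, here is a minimal sketch of how collection for a browsing-data experiment can be gated behind an explicit user choice. This is not Firefox’s actual code; the class, field names, and event shape are hypothetical and only illustrate the principle that nothing is recorded unless the user has said yes.

```typescript
// Minimal sketch of opt-in gated data collection.
// Names and event fields are hypothetical, not the Context Graph implementation.

interface BrowsingEvent {
  timestamp: number;
  // Only coarse, non-identifying fields in this sketch.
  pageLoadTimeMs: number;
}

class ExperimentTelemetry {
  // Collection is off unless the user has explicitly opted in.
  constructor(private readonly userOptedIn: boolean) {}

  record(event: BrowsingEvent, queue: BrowsingEvent[]): void {
    if (!this.userOptedIn) {
      // No opt-in means no collection at all: the event is dropped.
      return;
    }
    queue.push(event);
  }
}

// Usage: the flag comes from an explicit user choice, never a default.
const telemetry = new ExperimentTelemetry(/* userOptedIn */ false);
const queue: BrowsingEvent[] = [];
telemetry.record({ timestamp: Date.now(), pageLoadTimeMs: 340 }, queue);
console.log(queue.length); // 0 -- nothing collected without opt-in
```

Keeping the check in one obvious place, next to the point of collection, is also what makes the behavior easy to verify when the code is published.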

What these examples show is that our approach to privacy is actually rooted in our open source culture and commitment to transparency. Transparency is of course a big privacy buzzword, second only to trust. But at Mozilla it actually means something.

Our commitment to transparency is what allows our users to make informed choices about the data we collect. It is also what allows them to hold us accountable when we make mistakes. And to be clear, we do make mistakes. Privacy can be tricky and, despite our best efforts, we do sometimes make the wrong decision about the best way to protect our users. When we make those mistakes, you will know and you will tell us. We think that is a good thing. It is what makes Mozilla more worthy of your trust.

Our responsibility as a technology company is to create secure platforms, build features that improve security, and empower people with education and resources to better protect their privacy and security. All of that starts with your ability to actually see and verify, through our code and our actions, that we’ve got your back.
