We’re More Than The Sum Of Our Data

Alex Fowler

From the day I first browsed the Web, Mozilla has shaped my experience of the Internet. The community is one of the strongest forces making the Web what it is today. So I was intrigued when I was offered the chance to go from loyal user to paid contributor. The Web’s quick growth was creating new privacy concerns and Mozilla wanted to get in front of them. I had a successful consulting practice advising some of the biggest consumer brands on privacy and security, but I wanted to explore ways to have more impact.

What I found at Mozilla was truly inspiring. In the midst of massive investments in tracking and mining of user data, here was a group of people fiercely committed to making individual control part of the Web. Not since my time at the Electronic Frontier Foundation had I encountered an organization so well placed to reshape trust in the Internet. I was hooked.

That was three years ago, and I believe our work is more important than ever. According to documents leaked by Edward Snowden, governments see their ability to spy on our personal lives as the “price of admission” for use of an open Web. The same justification is given by industry lobbyists: that online tracking is the price for content and services. The powers-that-be believe we surrender the basic rights and freedoms we enjoy offline when we go online. And as someone who cares deeply about the Web, I take this personally.

A small group of people has decided that our privacy doesn’t matter. Privacy isn’t a philosophical abstraction. It’s what lets us control who we are through what we choose to reveal. It’s core to our autonomy, identity, and dignity as individuals. Privacy is what lets us trust that our laptops, phones, apps, and services are truly ours and not working against us. Abandoning privacy means accepting a Web where we are no longer informed participants.

At Mozilla, we believe privacy and security are fundamental and cannot be ignored. That principle is enshrined in our Manifesto. However, we prefer to skip the platitudes, white papers, and insider deals, choosing instead to drive change through our code. Industry groups and policy makers had been debating Do Not Track for years before we showed up, wrote 30 lines of code, and gave it — for free — to hundreds of millions of Firefox users. Within a year, all of the other major browsers followed our lead. We saw the same thing happen when we killed the annoying pop-up ad. And we’re doing it again, together with members of our contributor community, testing new approaches to cookies, personalization and more.

In the wake of Snowden’s revelations and the work of countless journalists and advocates, we have a rare moment to change things for the better. Each week, front-page articles detail new intrusions into our private lives by governments and corporations around the world. 570,000 people signed a letter demanding our governments StopWatching.Us, which we delivered, in person, to politicians in Washington, DC. Over 50 million people have enabled Do Not Track, sending trillions of anti-tracking signals across the Web each month and asking companies to respect their privacy. The world is being reminded of why privacy — why openness, transparency, and individual control — are fundamental not just to the Web, but to the future of our global, hyper-connected world.

I joined Mozilla because I found a community of people working to build the Web we need. If you believe that the Web and our privacy and security are worth fighting for, I ask you to support our work. Mozilla may compete in commercial markets, but we are proudly non-profit. Your personal contribution and those of other individual donors make it possible for us to stand up for users and our right to privacy. Click here to make a year-end donation to Mozilla — and help us build a Web that puts people before profits.

Alex Fowler
Chief Privacy Officer
Mozilla


This post launches the Mozilla end of year fundraising campaign. Over the balance of the year, you’ll hear personal stories from some of our leaders about why they joined Mozilla, the challenges that face the Web, and why your support matters. I’m pleased to have written the kick-off post and look forward to the discussion to come. — AF

Mozilla joins with Stanford and others to launch Cookie Clearinghouse

Alex Fowler

In a post this morning from Mozilla’s CTO Brendan Eich, we announced that we’re working with Stanford’s Center for Internet and Society to develop a Cookie Clearinghouse. The Cookie Clearinghouse will provide users of Firefox, Opera and other browsers with an independent service to address privacy concerns related to third-party cookies in a rational, trusted, transparent and consistent manner. The current third-party cookie patch will require additional modifications over the course of several months, depending on how quickly the new service takes shape and comes online. Note that there will be an open, public brown bag meeting on July 2nd where additional information will be presented.

Here’s what Brendan posted:

As you may recall from almost six weeks ago, we held the Safari-like third-party cookie patch (which blocks cookies set for domains you have not visited, according to your browser’s cookie database) from progressing to Firefox Beta because of two problems:

False positives. For example, say you visit a site named foo.com, which embeds cookie-setting content from a site named foocdn.com. With the patch, Firefox sets cookies from foo.com because you visited it, yet blocks cookies from foocdn.com because you never visited foocdn.com directly, even though there is actually just one company behind both sites.

False negatives. Meanwhile, in the other direction, just because you visit a site once does not mean you are ok with it tracking you all over the Internet on unrelated sites, forever more. Suppose you click on an ad by accident, for example. Or a site you trust directly starts setting third-party cookies you do not want.

Our challenge is to find a way to address these sorts of cases. We are looking for more granularity than deciding automatically and exclusively based upon whether you visit a site or not, although that is often a good place to start the decision process.

The logic driving us along the path to a better default third-party cookie policy looks like this:

  1. We want a third-party cookie policy that better protects privacy and encourages transparency.
  2. Naive visited-based blocking results in significant false negative and false positive errors.
  3. We need an exception management mechanism to refine the visited-based blocking verdicts.
  4. This exception mechanism cannot rely solely on the user in the loop, managing exceptions by hand. (When Safari users run into a false positive, they are advised to disable the block, and apparently many do so, permanently.)
  5. The only credible alternative is a centralized block-list (to cure false negatives) and allow-list (for false positives) service.

I’m very pleased that Aleecia McDonald of the Center for Internet and Society at Stanford has launched just such a list-based exception mechanism, the Cookie Clearinghouse (CCH).

Today Mozilla is committing to work with Aleecia and the CCH Advisory Board, whose members include Opera Software, to develop the CCH so that browsers can use its lists to manage exceptions to a visited-based third-party cookie block.

The CCH proposal is at an early stage, so we crave feedback. This means we will hold the visited-based cookie-blocking patch in Firefox Aurora while we bring up CCH and its Firefox integration, and test them.

Of course, browsers would cache the block- and allow-lists, just as we do for safe browsing. I won’t try to anticipate or restate details here, since we’re just starting. Please see the CCH site for the latest.

We are planning a public “brown bag” event for July 2nd at Mozilla to provide an update on where things stand and to gather feedback. I’ll update this post with details as they become available, but I wanted to share the date ASAP.

/be
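
For illustration, here is a minimal TypeScript sketch of the kind of decision Brendan describes: a visited-based third-party cookie rule refined by cached allow- and block-lists. The names, types, and list format are assumptions for the sake of the example, not the actual Firefox patch or CCH API.

```typescript
// Illustrative only: visited-based third-party cookie blocking refined by
// cached exception lists. Not the real Firefox patch or CCH data format.

interface ExceptionLists {
  allow: Set<string>; // domains cleared despite not being visited (cures false positives)
  block: Set<string>; // domains blocked even if visited once (cures false negatives)
}

function shouldAcceptCookie(
  cookieDomain: string,        // domain trying to set the cookie
  topLevelDomain: string,      // domain of the page the user is actually on
  visitedDomains: Set<string>, // domains the user has visited directly
  lists: ExceptionLists        // cached centrally, refreshed like Safe Browsing data
): boolean {
  if (cookieDomain === topLevelDomain) {
    return true; // first-party cookie: unaffected by this policy
  }
  if (lists.block.has(cookieDomain)) {
    return false; // listed tracker: block even if the user once visited it
  }
  if (lists.allow.has(cookieDomain)) {
    return true; // e.g. a CDN domain that belongs to a site the user actually uses
  }
  // Default: the naive visited-based rule from the original patch.
  return visitedDomains.has(cookieDomain);
}
```

In the foo.com/foocdn.com example above, foocdn.com would be a natural candidate for the allow-list, while a tracker the user reached through an accidental ad click could end up on the block-list.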

Designing Meaningful Security and Privacy Experiences (Part II)

Larissa

[This is the second of a two-part post from Mozilla’s User Experience team on their look at privacy and security. You can view the first post here.]

Usability and security/privacy often seem to be at odds in the product creation process; designers are wary of these features because they fear interruptions to the user’s flow, while security/privacy advocates believe that the user isn’t safe when we oversimplify or strip down the protections and warnings they want to put in place.

Part of the tension stems from a shared assumption that our users don’t care about security or privacy. We can certainly marshal evidence to support this claim: for example, most users thoughtlessly click through alarming messages, use passwords that are insecure, and don’t hesitate to share personal information online. But, after various opportunities to engage with people through research and workshops, I believe that “user apathy” isn’t the conclusion we should draw from these behaviors.

The desire to feel and be safe is a fundamental quality of being human. But when it comes to technology, most people feel that they have so little control over their security and privacy that, in the words of someone I interviewed, they “just cross [their] fingers and hope nothing bad will happen.” New cyber-threats seem to emerge every day, each more ominous and abstract than the last, until it becomes impossible for the average user to know how to reliably protect against them. Moreover, people feel powerless in an ecosystem where companies routinely ask them to hand over their personal information in exchange for services. Perhaps most importantly, most security and privacy choices that users are presented with are overwhelming and complex, dealing the final blow to a user’s sense of agency. (Additional insights from my Mozcamp Asia workshop.)


Participants at a security and privacy workshop at Mozcamp Singapore share “postcards” with Mozilla, telling us how we can help improve our user experience

Ultimately, I believe people need two things to engage meaningfully with security and privacy: they must find trustworthy entities that help them feel safe online, and they must have true control over their choices.

To address these intertwined needs in our products, I came up with the following four imperatives — user experience requirements that must be met for a product to be successful:

  1. Earn and Keep My Trust
  2. Respect My Time and Task
  3. Help Me Make a Thoughtful Decision
  4. Offer Control Without Harming Me

(You can learn more about each of these imperatives from my brownbag.)

These imperatives are already shaping our design and user messaging in projects such as the Mixed Content Block and Click-to-Play Plugins (in an upcoming design). They’ve also helped me frame strategic discussions on various Firefox OS and Firefox features, such as App Permissions and Firefox Health Report. I hope they will continue to bridge user experience and security/privacy, not only at Mozilla but in other organizations as well.

I started working on this framework for “meaningful security and privacy” to show that usability and security/privacy are necessary co-requisites to creating a good product.

When a product is truly secure, people have a better experience because they can use it confidently without fear or suspicion. When security choices are conveyed in a usable manner, people feel safer because they understand the consequences of their actions.

Security and privacy are deeply-held principles within Mozilla, and we often apply them from a policy or feature standpoint. I hope these design imperatives show that we can make an even greater impact on the Web by consciously incorporating them into our user experience.

This content reposted from the inaugural edition of the Mozilla UX Quarterly.

Exploring the Emotions of Security, Privacy and Identity (Part I)

Lindsay Kenzig

[This is the first of a two-part post from Mozilla’s User Experience Research team on their look at privacy and security.]


One of 14 In-home Interviews for Project Hydra

Mozilla’s User Experience Research team recently connected with the Identity team for Project Hydra. Project Hydra is an exploratory research project in which we interviewed participants in their homes to better understand what Identity means to them – both “offline” and online, and how these concepts overlap.

For people like Sara,* online security and privacy are lingering worries she never actually gets around to doing something about:

“I really should start doing it [passwords] differently. It’s just the frustrating factor. If I start using random digits and numbers [all the time], then I’d have to totally rely on one spreadsheet… I try and keep everything in my head as much as I possibly can.”

In fact, we learned through a series of interviews that for “mainstream” users like Sara, topics of privacy and security are uncomfortable and often actively ignored in daily life. In addition, users compartmentalize security and privacy, detaching them from how they view themselves and their activities online.

Users' View of Identity

This has big implications for Mozilla. For one, security and privacy are not differentiators in most users’ minds because they are focused elsewhere. Users have strong opinions about wanting to be safe, but pointedly addressing the issue with them brings up strong negative emotions. As Ben Adida, Director of Identity at Mozilla, puts it, “Security is extremely important, but it is not the selling point.”

So how do we help users be safe but remain positive? After deep analysis and synthesis of the qualitative data, Project Hydra identified several ways we could do so.

  1. Online security is confusing (even among experts)! Start with baby steps instead of trying to tackle the entire problem at once. Fit good security and privacy practices into users’ current tasks rather than asking them to learn and negotiate complex technical jargon or to alter the task they are on. Persona, Mozilla’s identity system for the web, is a good example of this.
  2. Users feel helpless. They feel security breaches are going to happen no matter what they do. If and when they decide to act, recognize the vulnerable emotions that come up when thinking about security and privacy. In the heat of the interaction, user-centered language and design principles reinforce that Mozilla will help them be safe (rather than leaving them even more confused or scared). Larissa Co on our User Experience team has some great examples of this in her work on “Meaningful Security and Privacy,” which will be explored in the second post.
  3. Security and privacy are abstract concepts. How do I know I am safe? Users cannot manage what they cannot see. Visualize and synthesize online behavior by developing systems that can analyze, connect, and anticipate activity. Users can then approach their security and privacy needs holistically rather than piecemeal. Collusion is an example of how Mozilla is helping users visualize their own behavior online.
  4. Cell phones make people feel particularly vulnerable. Users are more afraid and take more actions to physically protect their cell phones than other devices. Because people feel their mobile phone requires better security, focusing here may set expectations for higher security on other devices.

Mozilla’s Manifesto is being updated to say, “Individuals’ security and privacy on the Internet are fundamental and cannot be treated as optional.” To make this statement a reality and create outstanding products, it’s vital we understand security and privacy from an individual user’s perspective.

*name changed for confidentiality; participants consented to use of their words and photos

This content reposted from Mozilla’s UX Quarterly. More info on Project Hydra can be found on AirMozilla and our UX blog here and here.

Mozilla’s new Do Not Track dashboard: Firefox users continue to seek out and enable DNT

Alex Fowler

Mozilla is pleased to release a new interactive metrics page reporting monthly data on user adoption of Do Not Track (DNT) within Firefox. We’re making this data public both because it’s part of our mission and because we know there’s strong interest in the topic.

Currently, DNT adoption in the U.S. Firefox user base is approximately 17 percent. Globally, the average is around 11 percent. Based on these percentages, we estimate that our users send more than 135 billion DNT signals every day — more than four trillion DNT signals every month.

The new page has interactive graphs that show the overall adoption curve for Firefox (desktop) and Firefox for Android (mobile), as well as two maps to provide a view into regional differences of adoption around the world.

Note that no Firefox user is tracked to generate data for these metrics. Every 24 hours, both Firefox and Firefox for Android automatically download the latest list of insecure add-ons and/or extensions to disable as part of our blocklist service. As a DNT signal is included in all requests made by the browser of a user who has turned DNT on, we can count the number of times we see the signal. No other information is logged on our servers. Anyone with a website and access to a web server can start counting how many users are sending DNT:1, which is how the signal is expressed via HTTP requests.
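
Since the signal is just an HTTP request header, anyone can count it. Here is a minimal Node.js/TypeScript sketch of such a tally; it is only an illustration, not Mozilla's metrics pipeline.

```typescript
// Count how many incoming requests carry the "DNT: 1" header (illustration only).
import * as http from "http";

let total = 0;
let dnt = 0;

const server = http.createServer((req, res) => {
  total++;
  // Node lowercases header names; a browser with DNT enabled sends "DNT: 1".
  if (req.headers["dnt"] === "1") {
    dnt++;
  }
  res.end("ok");
});

// Print a running tally once a minute.
setInterval(() => {
  const share = total > 0 ? ((100 * dnt) / total).toFixed(1) : "0.0";
  console.log(`${dnt} of ${total} requests (${share}%) sent DNT: 1`);
}, 60_000);

server.listen(8080);
```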

Data, Apps and Developers

Jishnu Menon

As Mozilla launches Firefox OS and the Firefox Marketplace, we’ve been focused on improving privacy by empowering app developers and users with greater transparency, choice and control, including implementing a tiered permissions model and offering tips for designing apps with privacy in mind. Over the next year, we’ll be rolling out more features and resources designed to make data practices more transparent for users and easier for developers to declare.
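
To give a feel for what a tiered permissions model looks like from a developer's side, here is a simplified TypeScript sketch. The tier names mirror the Firefox OS app types, but the field names and structure below are illustrative assumptions rather than the exact manifest schema.

```typescript
// Simplified, illustrative model of tiered app permissions. More sensitive APIs
// require a higher app tier, and every permission carries a user-facing description.
type AppTier = "web" | "privileged" | "certified";

interface PermissionRequest {
  name: string;                      // e.g. "geolocation", "contacts"
  access?: "readonly" | "readwrite"; // only meaningful for data-store permissions
  description: string;               // why the app needs it, in the user's language
}

interface AppManifest {
  name: string;
  type: AppTier;
  permissions: PermissionRequest[];
}

// Hypothetical app declaring a single, narrowly scoped permission.
const exampleManifest: AppManifest = {
  name: "Example Notes",
  type: "privileged",
  permissions: [
    {
      name: "device-storage:pictures",
      access: "readonly",
      description: "Attach existing photos to a note",
    },
  ],
};

console.log(`${exampleManifest.name} requests ${exampleManifest.permissions.length} permission(s).`);
```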

In parallel with our own efforts, others have been innovating in the same space to push data transparency forward on mobile devices. We’re supportive of projects like these because they help drive the conversation forward and make privacy better across services and marketplaces.

Solutions that empower both developers and consumers are a critical part of making privacy better for users and the web, and Mozilla looks forward to continuing our contributions to the growing number of initiatives around this issue.

Firefox getting smarter about third-party cookies

Alex Fowler

Mozilla has a long-running interest in fostering greater transparency, trust and accountability related to privacy and the many cookie-based practices we see today.

Firefox Nightly v22.0a1 Privacy tab

On Friday, Mozilla released a Firefox patch into its “Nightly” channel that changes how cookies from third-party companies function. Users of this build of Firefox must directly interact with a site or company for its cookies to be set on their machine. The patch also provides an additional control setting under the “Privacy” tab in Firefox’s Preferences menu (see image).

Many years of observing Safari’s approach to third-party cookies, a rapidly expanding number of third-party companies using cookies to track users, and strong user support for more control are driving our decision to move forward with this patch.

We have a responsibility to advance features and controls that bring users’ expectations in line with how the web functions for them. As our General Counsel, Harvey Anderson, wrote a few weeks ago in a post about Mozilla’s recognition as the Most Trusted Internet Company for Privacy in 2012:

We all have to continue our efforts — both big and small — to create a more trustworthy environment of online products that seamlessly integrate ease of use, transparency, and user choice.

In my own use of this release this morning, I followed one of my typical browsing paths, starting with a look at surfing conditions, then local news, a major national news site, and a popular site covering the tech industry. (Incidentally, all the great coverage of our launch of Firefox OS at Mobile World Congress is really exciting!)

Here’s how the new patch changed the extent to which I was tracked:

Current Default: Allow All Cookies
  - 4 web sites used 8 first party domains
  - 81 cookies from first party domains
  - 117 third party domains
  - 304 cookies from third party domains
  - Total: 385 first & third party cookies

Proposed New Default: Allow Cookies Only From Visited Domains
  - 4 web sites used 8 first party domains
  - 75 cookies from first party domains
  - 0 third party domains
  - 0 cookies from third party domains
  - Total: 75 first party cookies

I cleared all my cookies before visiting these sites and then repeated the process several times, because I wanted to verify that four sites really did lead to over 300 cookies from more than 100 companies I had not visited. Display ads and sharing widgets on the sites worked fine, and as I clicked on them, the various parties involved were able to set cookies. The privacy policies on all four sites cover their cookie practices, including those of third parties. Interestingly, they all pointed me to settings in my browser to control the behavior of these cookies on their sites.

Mozilla is passionate about putting its users first and moving the web forward. That mission requires taking a leadership role on privacy, which we have repeatedly done (e.g., Do Not Track, Social API, Secure Search, Persona and Collusion).

Mozilla’s users frequently express concerns about web tracking, and we’ve been listening. We are constantly challenging ourselves to deliver a browser that conforms to user expectations while facilitating online innovation. We regularly review community and partner input, web standards, extensions, practices by other browsers, and much more. The new third party cookie patch in Firefox is just another example of those efforts.

The new default is currently only in this very early developer build of Firefox as it goes through Mozilla’s usual vetting process. As with other features we deploy, it will be several months of evaluating technical input from our users and the community before the new policy enters our Beta and General release versions of Firefox. The policy for how our current versions of Firefox handle cookies can be found here and here.

Mozilla loves to hear from our users about how it can make Firefox even better. We encourage all those interested to provide feedback via the mozilla.dev.privacy discussion group.

Cyber-security heating up on both sides of the Atlantic

handerson

In the US, another version of CISPA was reintroduced yesterday in the House of Representatives. The White House has also issued an executive order on the same topic. Similarly, in Europe, the European Commission recently published two documents that articulate a strategy for cybersecurity – the Cybersecurity Strategy of the European Union and the Proposed Directive on Network and Information Security. Information-sharing programs to improve Internet security may be one of the most important global technology policy issues this year. We’re currently looking at these proposals to develop a view and understand if and how they may impact the Mozilla mission. If you would like to contribute to this effort, we welcome your participation.

On this side of the Atlantic, an editorial by CISPA bill author Rep. Dutch Ruppersberger articulates the rationale for the new CISPA bill. He likens it to a “911 line for cyber emergencies” so companies can call in threats and share supporting information when or before they occur.

The CISPA bill, first introduced and later dropped last year, was problematic not because of its general goal of making critical infrastructure more secure, which is laudable, but because it compromised user privacy expectations. The new bill, among other provisions, provides for two-way sharing of information from the government to commercial organizations and from commercial entities to the government to better defend against cyber-security attacks.

It seems the current bill has the same defects as last time, as detailed by Mark Jaycox at EFF and Leslie Harris at the Center for Democracy and Technology. Both organizations oppose the new bill because it overrides existing privacy laws and fosters non-transparent sharing of personal user information with US government agencies without controls. To encourage and facilitate this kind of sharing, it also provides civil immunity to private companies for such sharing. Citing recent attacks on The New York Times, The Wall Street Journal and the Federal Reserve, other organizations such as CTIA, Verizon, and AT&T support the new CISPA bill. Civil liberties advocates appear to support the White House executive order.

With the accumulation of digital user data and preferences held by service providers and the reality that increased cyber-attacks also jeopardize user privacy, it seems that the tensions between national security and human rights/civil liberties will again be tested. It’s also unclear that this kind of sharing will really make a difference, so it seems the technical community needs to weigh in further. My hope is that there’s a reasonable balance that doesn’t cost users too much in the way of privacy to achieve the stated security goals.

“do track or do not track?” — that is the question

Sid Stamm


For a while now, we’ve been talking about how the Do Not Track feature really has three states: “user says nothing”, “user says track”, and “user says don’t track”. In Firefox 4, we introduced two of these states with a checkbox (“user says nothing” and “user says don’t track”), and many people are voicing their desire to opt out.

The three-state Do Not Track preference UI

Of course, it’s reasonable to expect that some people want tracking to improve the quality of the ads they see; after all, the goal of this feature is to help each individual say what they want, whether that’s pro-tracking or not.

I just finished updating the Firefox tracking preference interface to give people the ability to say, “this tracking thing is fine, bring on the custom content!” This change is still experimental, but within a day or so it will be available in our Nightly builds for testing. Take a look, let us know what you think.
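
For context, the three states map onto the DNT request header roughly like this (a minimal sketch with illustrative names; the real preference lives in Firefox's settings):

```typescript
// The three Do Not Track states and the header each one produces (illustrative).
type TrackingPreference = "unset" | "allow-tracking" | "do-not-track";

function dntHeaderFor(pref: TrackingPreference): { name: string; value: string } | null {
  switch (pref) {
    case "do-not-track":
      return { name: "DNT", value: "1" }; // user asks sites not to track
    case "allow-tracking":
      return { name: "DNT", value: "0" }; // user explicitly accepts tracking
    case "unset":
      return null; // user says nothing: no DNT header is sent at all
  }
}

// Example: dntHeaderFor("do-not-track") yields { name: "DNT", value: "1" }.
```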

Search Suggestions for Firefox for Android: Another example of Mozilla’s approach to Privacy by Design

Alex Fowler

With the latest release of Firefox for Android, we’ve added the ability to get search suggestions from Google before you even finish typing. With the small screen and even smaller keyboard on users’ phones, anything that makes it easier to discover and access sites is a huge improvement. The Awesome Bar already suggests bookmarks and recently visited sites. Implementing Google’s search suggestions in Firefox for Android now makes it that much easier to find sites users haven’t been to before.

Google can only make suggestions if it knows what you’re looking for. To do that, Firefox needs to send Google what you’re typing in the Awesome Bar as you type it. For many users, this makes sense. They know and trust Google and send completed search queries to Google anyway, so getting faster search suggestions is a welcome addition. 

The first time you start typing in the Awesome Bar on your Android device, Firefox asks you whether you want search suggestions from Google. The choice is presented in a prompt right where you’d look for the feature, so you can decide which experience you prefer and what you want to share. Of course, if you change your mind, you can always change your settings.

We added some other features to make search suggestions privacy-sensitive. For one thing, Firefox doesn’t ask Google for search suggestions if it looks like you’re typing a URL, like if you start with “www” or include a “:” or “/”. This means that even with search suggestions on, Google only gets asked about things you might actually want to search for, not every site you want to visit. We also make sure to get search suggestions (and searches themselves) over a secure HTTPS connection, so nobody else can view what you’re looking for.
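
A rough sketch of that kind of check is below (TypeScript, illustrative only; the function names and suggestion endpoint are assumptions, not Firefox's actual code):

```typescript
// Only ask the suggestion service when the input does not look like a URL,
// and always fetch suggestions over HTTPS. Illustrative sketch only.
function looksLikeUrl(input: string): boolean {
  const text = input.trim();
  return (
    text.startsWith("www") || // e.g. "www.example.com"
    text.includes(":") ||     // e.g. "https:" or "localhost:8080"
    text.includes("/")        // path separators suggest a URL, not a search
  );
}

async function fetchSuggestions(input: string): Promise<string[]> {
  if (looksLikeUrl(input)) {
    return []; // never send URL-like typing to the search provider
  }
  // Hypothetical HTTPS endpoint; the real endpoint and response format are
  // defined by the search provider.
  const resp = await fetch(
    "https://suggest.example.com/complete?q=" + encodeURIComponent(input)
  );
  return (await resp.json()) as string[];
}
```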

You can try out the latest version for Firefox for Android with search suggestions now by going to the Google Play store on your computer or Android device.