Response to President Obama’s Speech on Surveillance

Alex Fowler

Expansive government surveillance practices have severely damaged the health of the open Internet, prompting calls for change from diverse organizations and hundreds of thousands of Internet users around the world. President Obama’s speech on surveillance reform provided the first clear signs of the Administration’s response.

Overall, the strategy seems to be to leave current intelligence processes largely intact and improve oversight to a degree. We’d hoped for, and the Internet deserves, more. Without a meaningful change of course, the Internet will continue on its path toward a world of balkanization and distrust, a grave departure from its origins of openness and opportunity.

From our perspective as both an Internet company and a global community of users and developers, we’re concerned that the President didn’t address the most glaring reform needs. The President’s Review Board made 46 recommendations for surveillance reform, and some of the most important pieces are being ignored or punted to further review.

The Administration missed a compelling opportunity to:

  • Endorse legislative reform to limit surveillance, such as the USA FREEDOM Act and ECPA reform efforts;
  • Propose reforms to end the encouragement, promotion, or support of backdoors;
  • End efforts to undermine security standards and protocols; and
  • Adequately protect the privacy rights of foreign citizens with no connection to intelligence, military, or terrorist activity.

The speech also didn’t raise one of the most important issues determining the future of government surveillance and privacy: the priorities of the next director of the NSA. If a culture of unlimited data gathering above all else persists, legal reforms and improved technological protections will be watered down over time and will never be enough to restore trust to the Internet. Internet users around the world would be well served if the next director of the NSA makes transparency and human rights a true priority. In Benjamin Franklin’s oft-quoted words, “They who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.”

The President’s speech did include one important reform that supports a healthy, trustable Internet: creating a new public advocate for privacy within the specialized intelligence court established under FISA. Such an adversarial element is essential to ensure meaningful rule of law over government surveillance practices, in any context. In the U.S., where 99% of FISA court decisions ultimately favor the government, it seems particularly overdue.

Some of the Administration’s other ideas carry mixed benefits and harms for the future of the open Internet. Limiting the scale of some bulk collection programs helps, to a small degree, but it does not justify the continuation of practices that significantly undermine privacy. The plan to work with Congress on alternative ways to sustain bulk collection through third parties or alternative storage may similarly create additional harm. Third-party storage could add a layer of legal process and increase the practical cost of using the data, creating some safeguards and incentives against abuse. But those third parties might store the data insecurely or unreliably, posing significant risk to both the intelligence mission and the privacy of the people whose communications are stored.

At Mozilla, we’ve worked to protect privacy and trust online from many angles.

We’re going to keep working on this, pushing for meaningful change to surveillance practices and security technologies to help restore trust and support the open Internet around the world. We expect the President’s speech to be a floor for reform, not a ceiling, and we will make our positions known to Congress and the Administration. But we’ll need your help. For starters, you can join the movement at StopWatching.Us — and keep watching this page for more opportunities to make your voice heard.

Alex Fowler, Global Privacy & Policy Leader
Chris Riley, Senior Policy Engineer

Trust but Verify: Repost of article on security value of open source software

Alex Fowler

Over the weekend, my colleague Andreas Gal, together with Mozilla’s CTO Brendan Eich, published an article on the importance of open source software for maintaining the public’s trust that our products aren’t secretly working against the interests of our users.

In an effort to bring together posts related to privacy at Mozilla into one place, I’m republishing the post below.

Trust but Verify

Background

It is becoming increasingly difficult to trust the privacy properties of software and services we rely on to use the Internet. Governments, companies, groups and individuals may be surveilling us without our knowledge. This is particularly troubling when such surveillance is done by governments under statutes that provide limited court oversight and almost no room for public scrutiny.

As a result of laws in the US and elsewhere, prudent users must interact with Internet services knowing that despite how much any cloud-service company wants to protect privacy, at the end of the day most big companies must comply with the law. The government can legally access user data in ways that might violate the privacy expectations of law-abiding users. Worse, the government may force service operators to enable surveillance (something that seems to have happened in the Lavabit case).

Worst of all, the government can do all of this without users ever finding out about it, due to gag orders.

Implications for Browsers

This creates a significant predicament for privacy and security on the Open Web. Every major browser today is distributed by an organization within reach of surveillance laws. As the Lavabit case suggests, the government may request that browser vendors secretly inject surveillance code into the browsers they distribute to users. We have no information that any browser vendor has ever received such a directive. However, if that were to happen, the public would likely not find out due to gag orders.

The unfortunate consequence is that software vendors — including browser vendors — must not be blindly trusted. Not because such vendors don’t want to protect user privacy. Rather, because a law might force vendors to secretly violate their own principles and do things they don’t want to do.

Why Mozilla is different

Mozilla has one critical advantage over all other browser vendors: our products are truly open source. Internet Explorer is fully closed-source, and while the rendering engines WebKit and Blink (Chromium) are open source, the Safari and Chrome browsers that use them are not fully open source; both contain significant fractions of closed-source code.

Mozilla Firefox in contrast is 100% open source [1]. As Anthony Jones from our New Zealand office pointed out the other month, security researchers can use this fact to verify the executable bits contained in the browsers Mozilla is distributing, by building Firefox from source and comparing the built bits with our official distribution.

This will be most effective on platforms where we already use open-source compilers to produce the executable, avoiding the kind of compiler-level attack Ken Thompson described in 1984.
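As a rough illustration of the comparison step, here is a minimal sketch, assuming a locally compiled build and the corresponding official binary are available at hypothetical local paths; a real verification system would compare every shipped file, and the build would need to be deterministic (same compiler, flags, and build environment) for the bits to match byte for byte.

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical paths: a binary built locally from Mozilla source and the
# matching file from an official Mozilla distribution.
local_build = "obj-firefox/dist/bin/libxul.so"
official_build = "firefox-official/libxul.so"

if sha256_of(local_build) == sha256_of(official_build):
    print("Verified: locally built bits match the official distribution.")
else:
    print("ALERT: verified bits differ from official bits.")
```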

Call to Action

To ensure that no one can inject undetected surveillance code into Firefox, security researchers and organizations should:

  • regularly audit Mozilla source and verified builds by all effective means;
  • establish automated systems to verify official Mozilla builds from source; and
  • raise an alert if the verified bits differ from official bits.

In the best case, we will establish such a verification system at a global scale, with participants from many different geographic regions and political and strategic interests and affiliations.

Security is never “done” — it is a process, not a final rest-state. No silver bullets. All methods have limits. However, open-source auditability cleanly beats the lack of ability to audit source vs. binary.

Through international collaboration of independent entities we can give users the confidence that Firefox cannot be subverted without the world noticing, and offer a browser that verifiably meets users’ privacy expectations.

See bug 885777 to track our work on verifiable builds.

End-to-End Trust

Beyond this first step, can we use such audited browsers as trust anchors, to authenticate fully-audited open-source Internet services? This seems possible in theory. No one has built such a system to our knowledge, but we welcome precedent citations and experience reports, and encourage researchers to collaborate with us.

Brendan Eich, CTO and SVP Engineering, Mozilla
Andreas Gal, VP Mobile and R&D, Mozilla

[1] Firefox on Linux is the best case, because the C/C++ compiler, runtime libraries, and OS kernel are all free and open source software. Note that even on Linux, certain hardware-vendor-supplied system software, e.g., OpenGL drivers, may be closed source.

Nationwide Day of Action for Online Privacy

Chris Riley

Mozilla stands with hundreds of major technology companies and nonprofit organizations, and tens of thousands of digital advocates, in calling for reform of the Electronic Communications Privacy Act, or ECPA.

ECPA was enacted in 1986 to ensure that wiretaps of then “new” forms of electronic communications (e.g., email messages between computers) were limited in the same way that telephone wiretaps were. The safeguards have been watered down over the intervening years, and were not extended to data stored on a computer. The result is that emails, social media messages, and other communications that users may consider private are not uniformly treated as such under United States law.

Today, Congress, with bipartisan support, has proposed changes to ECPA. Yet many government agencies would prefer to see these reform efforts die. Internet users who value privacy and trust online must make their voices heard.

Mozilla supports efforts for positive change in this space – and we’d like your help. We’re asking you to join us by signing a White House petition asking for support for sensible ECPA reform. These changes alone won’t eliminate the harms, but, like the USA FREEDOM Act, they will make a positive contribution to that effort.

We’re More Than The Sum Of Our Data

Alex Fowler

From the day I first browsed the Web, Mozilla has shaped my experience of the Internet. The community is one of the strongest forces making the Web what it is today. So I was intrigued when I was offered the chance to go from loyal user to paid contributor. The Web’s quick growth was creating new privacy concerns and Mozilla wanted to get in front of them. I had a successful consulting practice advising some of the biggest consumer brands on privacy and security, but I wanted to explore ways to have more impact.

What I found at Mozilla was truly inspiring. In the midst of massive investments in tracking and mining of user data, here was a group of people fiercely committed to making individual control part of the Web. Not since my time at the Electronic Frontier Foundation had I encountered an organization so well placed to reshape trust in the Internet. I was hooked.

That was three years ago, and I believe our work is more important than ever. According to leaked documents from Edward Snowden, governments see their ability to spy on our personal lives as the “price of admission” for use of an open Web. The same justification is given by industry lobbyists: that online tracking is the price for content and services. The powers that be believe we surrender the basic rights and freedoms we enjoy offline when we are online. And as someone who cares deeply about the Web, I take this personally.

A small group of people has decided that our privacy doesn’t matter. Privacy isn’t a philosophical abstraction. It’s what lets us control who we are through what we choose to reveal. It’s core to our autonomy, identity, and dignity as individuals. Privacy is what lets us trust that our laptops, phones, apps, and services are truly ours and not working against us. Abandoning privacy means accepting a Web where we are no longer informed participants.

At Mozilla, we believe privacy and security are fundamental and cannot be ignored. These principles are enshrined in our Manifesto. However, we prefer to skip the platitudes, white papers, and insider deals, choosing instead to drive change through our code. Industry groups and policy makers had been debating Do Not Track for years before we showed up, wrote 30 lines of code, and gave it — for free — to hundreds of millions of Firefox users. Within a year, all of the other major browsers followed our lead. We saw the same thing happen when we killed the annoying pop-up ad. And we’re doing it again, together with members of our contributor community, testing new approaches to cookies, personalization and more.

In the wake of Snowden’s revelations and the work of countless journalists and advocates, we have a rare moment to change things for the better. Each week, front-page articles detail new intrusions into our private lives by governments and corporations around the world. 570,000 people signed a letter demanding our governments StopWatching.Us, which we delivered, in person, to politicians in Washington, DC. Over 50 million people have enabled Do Not Track, sending trillions of anti-tracking signals across the Web each month and asking companies to respect their privacy. The world is being reminded of why privacy — why openness, transparency, and individual control — are fundamental not just to the Web, but to the future of our global, hyper-connected world.

I joined Mozilla because I found a community of people working to build the Web we need. If you believe that the Web and our privacy and security are worth fighting for, I ask you to support our work. Mozilla may compete in commercial markets, but we are proudly non-profit. Your personal contribution and those of other individual donors make it possible for us to stand up for users and our right to privacy. Click here to make a year-end donation to Mozilla — and help us build a Web that puts people before profits.

Alex Fowler
Chief Privacy Officer
Mozilla


This post launches the Mozilla end of year fundraising campaign. Over the balance of the year, you’ll hear personal stories from some of our leaders about why they joined Mozilla, the challenges that face the Web, and why your support matters. I’m pleased to have written the kick-off post and look forward to the discussion to come. — AF

Mozilla joins with Stanford and others to launch Cookie Clearinghouse

Alex Fowler

In a post this morning from Mozilla’s CTO Brendan Eich, we announced that we’re working with Stanford’s Center for Internet and Society to develop a Cookie Clearinghouse. The Cookie Clearinghouse will provide users of Firefox, Opera, and other browsers with an independent service to address privacy concerns related to third-party cookies in a rational, trusted, transparent, and consistent manner. The current third-party cookie patch will require additional modifications over the course of several months, depending on how quickly the new service takes shape and comes online. Note that there will be an open, public brown bag meeting on July 2nd where additional information will be presented.

Here’s what Brendan posted:

As you may recall from almost six weeks ago, we held the Safari-like third-party cookie patch, which blocks cookies set for domains you have not visited according to your browser’s cookie database, from progressing to Firefox Beta, because of two problems:

False positives. For example, say you visit a site named foo.com, which embeds cookie-setting content from a site named foocdn.com. With the patch, Firefox sets cookies from foo.com because you visited it, yet blocks cookies from foocdn.com because you never visited foocdn.com directly, even though there is actually just one company behind both sites.

False negatives. Meanwhile, in the other direction, just because you visit a site once does not mean you are ok with it tracking you all over the Internet on unrelated sites, forever more. Suppose you click on an ad by accident, for example. Or a site you trust directly starts setting third-party cookies you do not want.

Our challenge is to find a way to address these sorts of cases. We are looking for more granularity than deciding automatically and exclusively based upon whether you visit a site or not, although that is often a good place to start the decision process.

The logic driving us along the path to a better default third-party cookie policy looks like this:

  1. We want a third-party cookie policy that better protects privacy and encourages transparency.
  2. Naive visited-based blocking results in significant false negative and false positive errors.
  3. We need an exception management mechanism to refine the visited-based blocking verdicts.
  4. This exception mechanism cannot rely solely on the user in the loop, managing exceptions by hand. (When Safari users run into a false positive, they are advised to disable the block, and apparently many do so, permanently.)
  5. The only credible alternative is a centralized block-list (to cure false negatives) and allow-list (for false positives) service.
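To make this decision flow concrete, here is a minimal, purely illustrative sketch of a visited-based verdict refined by centralized allow- and block-lists in the spirit of CCH. The function, list contents, and domains are hypothetical, not the actual Firefox or CCH implementation, and real cookie policy must also handle subdomains and public suffixes.

```python
# Illustrative only: a simplified model of a visited-based third-party cookie
# policy refined by centralized exception lists (allow-list and block-list).

def should_accept_cookie(cookie_domain, visited_domains, allow_list, block_list):
    """Decide whether to accept a cookie set by cookie_domain."""
    if cookie_domain in block_list:
        # Cures false negatives: block even though the user visited the site once.
        return False
    if cookie_domain in allow_list:
        # Cures false positives: allow even though the user never visited directly
        # (e.g., a CDN domain operated by a site the user did visit).
        return True
    # Default visited-based verdict.
    return cookie_domain in visited_domains

# Hypothetical data echoing the examples above:
visited = {"foo.com"}
allow = {"foocdn.com"}          # same company as foo.com
block = {"tracker.example"}     # visited once by accident, tracks everywhere

print(should_accept_cookie("foocdn.com", visited, allow, block))       # True
print(should_accept_cookie("tracker.example", visited, allow, block))  # False
```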

I’m very pleased that Aleecia McDonald of the Center for Internet and Society at Stanford has launched just such a list-based exception mechanism, the Cookie Clearinghouse (CCH).

Today Mozilla is committing to work with Aleecia and the CCH Advisory Board, whose members include Opera Software, to develop the CCH so that browsers can use its lists to manage exceptions to a visited-based third-party cookie block.

The CCH proposal is at an early stage, so we crave feedback. This means we will hold the visited-based cookie-blocking patch in Firefox Aurora while we bring up CCH and its Firefox integration, and test them.

Of course, browsers would cache the block- and allow-lists, just as we do for safe browsing. I won’t try to anticipate or restate details here, since we’re just starting. Please see the CCH site for the latest.

We are planning a public “brown bag” event for July 2nd at Mozilla to provide an update on where things stand and to gather feedback. I’ll update this post with details as they become available, but I wanted to share the date ASAP.

/be

Designing Meaningful Security and Privacy Experiences (Part II)

Larissa

[This is the second of a two-part post from Mozilla’s User Experience team on their look at privacy and security. You can view the first post here.]

Usability and security/privacy often seem to be at odds in the product creation process; designers are wary of these features because they fear interruptions to the user’s flow, while security/privacy advocates believe that the user isn’t safe when we oversimplify or strip down the protections and warnings they want to put in place.

Part of the tension stems from a shared assumption that our users don’t care about security or privacy. We can certainly marshal evidence to support this claim: for example, most users thoughtlessly click through alarming messages, use passwords that are insecure, and don’t hesitate to share personal information online. But, after various opportunities to engage with people through research and workshops, I believe that “user apathy” isn’t the conclusion we should draw from these behaviors.

The desire to feel and be safe is a fundamental part of being human. But when it comes to technology, most people feel they have so little control over their security and privacy that, in the words of someone I interviewed, they “just cross [their] fingers and hope nothing bad will happen.” New cyber-threats seem to emerge every day, each more ominous and abstract than the last, until it becomes impossible for the average user to know how to reliably protect against them. Moreover, people feel powerless in an ecosystem where companies routinely ask them to hand over their personal information in exchange for services. Perhaps most importantly, most of the security and privacy choices users are presented with are overwhelming and complex, dealing the final blow to a user’s sense of agency. (Additional insights from my Mozcamp Asia workshop.)

Participants at a security and privacy workshop at Mozcamp Singapore share “postcards” with Mozilla, telling us how we can help improve our user experience.

Ultimately, I believe people need two things to engage meaningfully with security and privacy: they must find trustworthy entities that help them feel safe online, and they must have true control over their choices.

To address these intertwined needs in our products, I came up with the following four imperatives — user experience requirements that must be met for a product to be successful:

  1. Earn and Keep My Trust
  2. Respect My Time and Task
  3. Help Me Make a Thoughtful Decision
  4. Offer Control Without Harming Me

(You can learn more about each of these imperatives from my brownbag.)

These imperatives are already shaping our design and user messaging in projects such as the Mixed Content Block and Click-to-Play Plugins (in a coming design). They’ve also helped me frame strategic discussions on various Firefox OS and Firefox features, such as App Permissions and Firefox Health Report. I hope they will continue to bridge the relationship between user experience and security/privacy, not only at Mozilla but in other organizations.

I started working on this framework for “meaningful security and privacy” to show that usability and security/privacy are necessary co-requisites to creating a good product.

When a product is truly secure, people have a better experience because they can use it confidently without fear or suspicion. When security choices are conveyed in a usable manner, people feel safer because they understand the consequences of their actions.

Security and privacy are deeply-held principles within Mozilla, and we often apply them from a policy or feature standpoint. I hope these design imperatives show that we can make an even greater impact on the Web by consciously incorporating them into our user experience.

This content reposted from the inaugural edition of the Mozilla UX Quarterly.

Exploring the Emotions of Security, Privacy and Identity (Part I)

Lindsay Kenzig

[This is the first of a two-part post from Mozilla’s User Experience Research team on their look at privacy and security.]

One of 14 In-home Interviews for Project Hydra

Mozilla’s User Experience Research team recently connected with the Identity team for Project Hydra. Project Hydra is an exploratory research project in which we interviewed participants in their homes to better understand what Identity means to them – both “offline” and online, and how these concepts overlap.

For people like Sara,* online security and privacy are lingering worries she never actually gets around to doing something about:

“I really should start doing it [passwords] differently. It’s just the frustrating factor. If I start using random digits and numbers [all the time], then I’d have to totally rely on one spreadsheet… I try and keep everything in my head as much as I possibly can.”

In fact, we learned through a series of interviews that for “mainstream” users like Sara, topics of privacy and security are uncomfortable and often actively ignored in daily life. In addition, users compartmentalize security and privacy, detaching them from how they view themselves and their activities online.

Users’ View of Identity

This has big implications for Mozilla. For one, security and privacy are not differentiators in most users’ minds because they are focused elsewhere. Users have strong opinions about wanting to be safe, but pointedly addressing the issue with them brings up strong negative emotions. As Ben Adida, Director of Identity at Mozilla, puts it, “Security is extremely important, but it is not the selling point.”

So how do we help users be safe but remain positive? After deep analysis and synthesis of the qualitative data, Project Hydra identified several ways we could do so.

  1. Online security is confusing (even among experts)! Start with baby steps instead of trying to tackle the entire problem at once. Fit good security and privacy practices into users’ current tasks rather than asking them to learn and negotiate complex technical jargon or alter the task at hand. Persona, Mozilla’s identity system for the web, is a good example of this.
  2. Users feel helpless. They feel security breaches are going to happen no matter what they do. If and when they decide to act, recognize the vulnerable emotions that come up when thinking about security and privacy. In the heat of the interaction, choose user-centered language and design principles that reinforce that Mozilla will help them stay safe (rather than leaving them even more confused or scared). Larissa Co on our User Experience team has some great examples of this with her work on “Meaningful Security and Privacy,” which will be explored in the second post.
  3. Security and privacy are abstract concepts. How do I know I am safe? Users cannot manage what they cannot see. Visualize and synthesize online behavior by developing systems that can analyze, connect, and anticipate activity. Users can then address their security and privacy needs holistically rather than piecemeal. Collusion is an example of how Mozilla is helping users visualize their own behavior online.
  4. Cell phones make people feel particularly vulnerable. Users are more afraid and take more actions to physically protect their cell phones than other devices. Because people feel their mobile phone requires better security, focusing here may set expectations for higher security on other devices.

Mozilla’s Manifesto is being updated to say, “Individuals’ security and privacy on the Internet are fundamental and cannot be treated as optional.” To make this statement a reality and create outstanding products, it’s vital we understand security and privacy from an individual user’s perspective.

*Name changed for confidentiality; participants consented to the use of their words and photos.

This content reposted from Mozilla’s UX Quarterly. More info on Project Hydra can be found on AirMozilla and our UX blog here and here.

Mozilla’s new Do Not Track dashboard: Firefox users continue to seek out and enable DNT

Alex Fowler

Mozilla is pleased to release a new interactive metrics page reporting monthly data on user adoption of Do Not Track (DNT) within Firefox. We’re making this data public both because it’s part of our mission and because we know there’s strong interest in the topic.

Currently, DNT adoption in the U.S. Firefox user base is approximately 17 percent. Globally, the average is around 11 percent. Based on these percentages, we estimate that our users send more than 135 million DNT signals every day — more than four trillion DNT signals every month.

The new page has interactive graphs that show the overall adoption curve for Firefox (desktop) and Firefox for Android (mobile), as well as two maps to provide a view into regional differences of adoption around the world.

Note that no Firefox user is tracked to generate data for these metrics. Every 24 hours, both Firefox and Firefox for Android automatically download the latest list of insecure add-ons and extensions to disable as part of our blocklist service. Because a DNT signal is included in every request made by a browser whose user has turned DNT on, we can count the number of times we see the signal on those requests. No other information is logged on our servers. Anyone with a website and access to a web server can start counting how many users are sending DNT:1, which is how the signal is expressed in HTTP requests.
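As a purely illustrative sketch of that kind of counting, the snippet below is a tiny WSGI application that tallies requests carrying the DNT: 1 header (exposed to WSGI applications as HTTP_DNT). It is not Mozilla’s metrics pipeline; it only shows how any site operator could start counting the signal.

```python
# Minimal sketch: count incoming requests that carry the "DNT: 1" header.
from wsgiref.simple_server import make_server

dnt_on = 0
total = 0

def app(environ, start_response):
    global dnt_on, total
    total += 1
    # WSGI exposes the "DNT: 1" request header as HTTP_DNT.
    if environ.get("HTTP_DNT") == "1":
        dnt_on += 1
    body = f"DNT:1 seen on {dnt_on} of {total} requests\n".encode()
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [body]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```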

Data, Apps and Developers

Jishnu Menon

As Mozilla launches Firefox OS and the Firefox Marketplace, we’ve focused on improving privacy by empowering app developers and users with greater transparency, choice, and control, including a tiered permissions model and tips for designing apps with privacy in mind. Over the next year, we’ll be rolling out more features and resources designed to make data practices more transparent for users and easier for developers to disclose.

In parallel with our own efforts, others have been innovating in the same space to push data transparency forward on mobile devices. We’re supportive of these projects because they help drive the conversation forward and make privacy better across services and marketplaces.

Solutions that empower both developers and consumers are a critical part of making privacy better for users and the web, and Mozilla looks forward to continuing our contributions to the growing number of initiatives around this issue.

Firefox getting smarter about third-party cookies

Alex Fowler

Mozilla has a long-running interest in fostering greater transparency, trust, and accountability around privacy and the many cookie-based practices we see today.

Firefox Nightly v22.0a1 Privacy tab

On Friday, Mozilla released a Firefox patch into its “Nightly” channel that changes how cookies from third party companies function. Users of this build of Firefox must directly interact with a site or company for a cookie to be installed on their machine. The patch also provides an additional control setting under the “Privacy” tab in Firefox’s Preferences menu (see image).
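In simplified terms, the new default behaves roughly like the sketch below: a cookie is accepted only if the user has directly visited the domain setting it. The domains here are hypothetical, and the real patch consults the browser’s cookie database and handles subdomain matching, which this sketch ignores.

```python
# Simplified model of the proposed default: accept a cookie only if the user
# has directly visited the domain that is setting it.
def accept_cookie(setting_domain, visited_domains):
    return setting_domain in visited_domains

visited = {"surfreport.example", "localnews.example"}   # hypothetical browsing history
print(accept_cookie("localnews.example", visited))  # first party the user visited -> True
print(accept_cookie("adnetwork.example", visited))  # third party never visited    -> False
```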

Many years of observing Safari’s approach to third party cookies, a rapidly expanding number of third party companies using cookies to track users, and strong user support for more control are driving our decision to move forward with this patch.

We have a responsibility to advance features and controls that bring users’ expectations in line with how the web functions for them. As our General Counsel, Harvey Anderson, wrote a few weeks ago in a post about Mozilla’s recognition as the Most Trusted Internet Company for Privacy in 2012:

We all have to continue our efforts — both big and small — to create a more trustworthy environment of online products that seamlessly integrate ease of use, transparency, and user choice.

In my own use of this release this morning, I followed one of my typical browsing paths, starting with a look at surfing conditions, then local news, a major national news site, and a popular site covering the tech industry. (Incidentally, all the great coverage of our launch of Firefox OS at Mobile World Congress is really exciting!)

Here’s how the new patch changed the extent to which I was tracked:

Current Default (Allow All Cookies):

  • 4 web sites used 8 first party domains
  • 81 cookies from first party domains
  • 117 third party domains
  • 304 cookies from third party domains
  • Total: 385 first & third party cookies

Proposed New Default (Allow Cookies Only From Visited Domains):

  • 4 web sites used 8 first party domains
  • 75 cookies from first party domains
  • 0 third party domains
  • 0 cookies from third party domains
  • Total: 75 first party cookies

I cleared all my cookies before visiting these sites and then re-performed this process several times, as I wanted to verify that in fact four sites did lead to over 300 cookies from more than 100 companies I had not visited. Display ads and sharing widgets on the sites worked fine, and as I clicked on them, the various parties involved were able to set cookies. The privacy policies on all four sites cover their cookie practices, including from third parties. Interestingly, they all pointed me to using settings in my browser to control the behavior of these cookies on their sites.

Mozilla is passionate about putting its users first and moving the web forward. That mission requires taking a leadership role on privacy, which we have repeatedly done (e.g., Do Not Track, Social API, Secure Search, Persona and Collusion).

Mozilla’s users frequently express concerns about web tracking, and we’ve been listening. We are constantly challenging ourselves to deliver a browser that conforms to user expectations while facilitating online innovation. We regularly review community and partner input, web standards, extensions, practices by other browsers, and much more. The new third party cookie patch in Firefox is just another example of those efforts.

The new default is currently only in this very early developer build of Firefox as it goes through Mozilla’s usual vetting process. As with other features we deploy, it will take several months of evaluating technical input from our users and the community before the new policy enters our Beta and General release versions of Firefox. The policy for how our current versions of Firefox handle cookies can be found here and here.

Mozilla loves to hear from our users about how it can make Firefox even better. We encourage all those interested to provide feedback via the mozilla.dev.privacy discussion group.