The future of ads and privacy
The modern web is funded by advertisements. Advertisements pay for all those “free” services you love, as well as many of the products you use on a daily basis — including Firefox. There’s nothing inherently wrong with advertising: Mozilla’s Principle #9 states that “Commercial involvement in the development of the internet brings many benefits.” However, that principle goes on to say that “a balance between commercial profit and public benefit is critical,” and that’s where things have gone wrong: in many situations, advertising on the web is powered by ubiquitous tracking of people’s activity in a way that is deeply harmful to users and to the web as a whole.
Some Background
The ad tech ecosystem is incredibly complicated, but at its heart, the way that web advertising works is fairly simple. As you browse the web, trackers (mostly, but not exclusively, advertisers) follow you around and build up a profile of your browsing history. Then, when you go to a site that wants to show you an ad, that browsing history is used to decide which of the potential ads you actually get shown.
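To make that selection step concrete, here is a toy sketch in Python. The site names, ad names, and matching rule are all invented for illustration; real ad selection involves auctions and far richer signals, but the core idea is the same: the profile of sites you’ve visited is matched against candidate ads, and the best-overlapping ad wins.

```python
# Toy model of interest-based ad selection (all names are made up).
browsing_profile = {"running-shoes.example", "marathon-news.example",
                    "trail-maps.example"}

candidate_ads = {
    "sneaker_sale": {"running-shoes.example", "marathon-news.example"},
    "luxury_watches": {"watch-reviews.example"},
    "energy_gels": {"marathon-news.example"},
}

def pick_ad(profile: set[str], ads: dict[str, set[str]]) -> str:
    # Pick the ad whose targeted sites overlap the browsing profile the most.
    return max(ads, key=lambda name: len(ads[name] & profile))

print(pick_ad(browsing_profile, candidate_ads))  # -> "sneaker_sale"
```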
The visible part of web tracking is creepy enough — why are those pants I looked at last week following me around the Internet? — but the invisible part is even worse: hundreds of companies you’ve never heard of follow you around as you browse and then use your data for their own purposes or sell it to other companies you’ve also never heard of.
The primary technical mechanism used by trackers is what’s called “third-party cookies”. A good description of third-party cookies can be found here; briefly, a cookie is a piece of data that a website stores in your browser and can retrieve later. A third-party cookie is a cookie set by someone other than the page you’re visiting (typically a tracker). The tracker works with the web site to embed some of the tracker’s code on the page (often this code is also responsible for showing ads), and that code sets a cookie for the tracker. Every time you go to a page the tracker is embedded on, it sees the same cookie and can use that to link up all the sites you go to.
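Here is a minimal sketch of why that works, using Python and made-up site and tracker names. It models the browser’s cookie storage as a dictionary keyed only by the origin that set the cookie, which is all a tracker needs to recognize you on every site where it is embedded.

```python
import secrets

# A simplified, unpartitioned cookie jar: cookies are keyed only by the
# origin that set them, regardless of which site embedded that origin.
cookie_jar: dict[str, str] = {}

def visit(top_level_site: str, embedded_tracker: str) -> str:
    """Simulate loading a page that embeds a tracker's script or pixel."""
    if embedded_tracker not in cookie_jar:
        # First contact with this tracker: it assigns a random ID and
        # stores it as a (third-party) cookie.
        cookie_jar[embedded_tracker] = secrets.token_hex(8)
    tracker_id = cookie_jar[embedded_tracker]
    print(f"{embedded_tracker} sees ID {tracker_id} on {top_level_site}")
    return tracker_id

# The same tracker is embedded on three unrelated sites...
ids = {visit(site, "tracker.example") for site in
       ["news.example", "shoes.example", "health.example"]}

# ...and receives the identical cookie on each, so it can join the visits
# into one browsing profile.
assert len(ids) == 1
```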
Cookies themselves are an important part of the web — they’re what let you log into sites, maintain your shopping carts, etc. However, third-party cookies are used in a way that the designers of the web didn’t really intend, and unfortunately they’re now ubiquitous. While they have some legitimate uses, like federated login, they are mostly used for tracking user behavior.
Obviously, this is bad and it shouldn’t be a surprise to anybody who has followed our work in Firefox that we believe this needs to change. We’ve been working for years to drive the industry in a better direction. In 2015 we launched Tracking Protection, our first major step towards blocking tracking in the browser. In 2019 we turned on a newer version of our anti-tracking technology by default for all of our users. And we’re not the only ones doing this.
We believe all browsers should protect their users from tracking, particularly cookie-based tracking, and should be moving expeditiously to do so.
Privacy Preserving Advertising
Although third-party cookies are bad news, they are now so baked into the web that it won’t be easy to get rid of them. Because they’re a dual-use technology with some legitimate applications, just turning them off (or doing something more sophisticated, like Firefox Total Cookie Protection) can break some web sites for users. Moreover, we have to be constantly on guard against new tracking techniques.
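For contrast with the earlier sketch, here is an equally simplified illustration of the partitioning idea behind approaches like Total Cookie Protection (this is my own toy model, not Firefox’s actual implementation): the cookie jar is keyed by the pair of top-level site and embedded origin, so the same tracker gets a different cookie on every site and can no longer link the visits together.

```python
import secrets

# Partitioned cookie jar: keyed by (top-level site, embedded origin).
# Illustrative model only, not the browser's real data structure.
partitioned_jar: dict[tuple[str, str], str] = {}

def visit(top_level_site: str, embedded_tracker: str) -> str:
    key = (top_level_site, embedded_tracker)
    if key not in partitioned_jar:
        partitioned_jar[key] = secrets.token_hex(8)
    return partitioned_jar[key]

ids = {visit(site, "tracker.example") for site in
       ["news.example", "shoes.example", "health.example"]}

# Each site yields a different ID, so cross-site linking via cookies fails.
assert len(ids) == 3
```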
One idea that has gotten a lot of attention recently is what’s called “Privacy Preserving Advertising” (PPA). The basic idea has a long history with systems such as Adnostic, PrivAd, and AdScale, but has lately been reborn with proposals from Google, Microsoft, Apple, and Criteo, among others. The details are of course fairly complicated, but the general idea is straightforward: identify the legitimate (i.e., non-harmful) applications for tracking techniques and build alternative technical mechanisms for those applications without threatening user privacy. Once we have done that, it becomes much more practical to strictly limit the use of third-party cookies.
This is a generally good paradigm: technology has advanced a lot since cookies were invented in the 1990s and it’s now possible to do many things privately that used to require just collecting user data. But, of course, it’s also possible to use technology to do things that aren’t so good (which is how we got into this hole in the first place). When looking at a set of technologies like PPA, we need to ask:
- Are the use cases for the technology actually good for users and for the web?
- Do these technologies improve user privacy and security? Are they collecting the minimal amount of data that is necessary to accomplish the task?
- Are these technologies being developed in an open standards process with input from all stakeholders?
Because this isn’t just one technology but rather a set of them, we should expect some pieces to be better than others. In particular, ad measurement is a use case that is important to the ecosystem, and we think that getting this one component right can drive value for consumers and engage advertising stakeholders. There’s overlap here with technologies like Prio which we already use in Firefox. On the other hand, we’re less certain about a number of the proposed technologies for user targeting, which have privacy properties that seem hard to analyze. This is a whole new area of technology, so we should expect it to be hard, but that’s also a reason to make sure we get it right.
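As one concrete illustration of the measurement direction, here is a minimal sketch of the additive secret sharing at the core of Prio-style aggregation. This is my own simplified example, not the Prio protocol itself (which adds zero-knowledge proofs that each submission is well-formed): each browser splits a private value, say “did this ad lead to a purchase?”, into two random shares sent to two non-colluding servers, and only the aggregate total can be recovered.

```python
import secrets

PRIME = 2**61 - 1  # all arithmetic is done modulo a large prime

def split(value: int) -> tuple[int, int]:
    """Split a private value into two additive shares.
    Each share alone is uniformly random and reveals nothing."""
    share_a = secrets.randbelow(PRIME)
    share_b = (value - share_a) % PRIME
    return share_a, share_b

# Each client reports a private 0/1 value (e.g. "saw the ad and converted").
client_values = [1, 0, 1, 1, 0, 0, 1]

server_a_total = 0
server_b_total = 0
for value in client_values:
    a, b = split(value)
    server_a_total = (server_a_total + a) % PRIME
    server_b_total = (server_b_total + b) % PRIME

# Each server publishes only its sum of shares; combining the two sums
# reveals the aggregate count, but no individual client's value.
total = (server_a_total + server_b_total) % PRIME
assert total == sum(client_values)
print(f"aggregate conversions: {total}")
```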
What’s next?
Obviously, this is just the barest overview. In upcoming posts we’ll provide a more detailed survey of the space, covering the existing situation in more detail, some of the proposals on offer, and where we think the big opportunities are to improve things in both the technical and policy domains.
For more on this:
Building a more privacy preserving ads-based ecosystem
Mozilla responds to the UK CMA consultation on Google’s commitments on the Chrome Privacy Sandbox