Categories: Announcements Privacy

Plugging the CSS History Leak

Privacy isn’t always easy.

We’re close to landing some changes in the Firefox development tree that will fix a privacy leak that browsers have been struggling with for some time. We’re really excited about this fix, and we hope other browsers will follow suit. It’s a tough problem to fix, though, so I’d like to describe how we ended up with this approach.

History Sniffing

Links can look different on web sites based on whether or not you’ve visited the page they reference. You’ve probably seen this before: in some cases, visited links are purple instead of blue. This is just one of the many features web designers use to make the web the best it can be, and for the most part that’s a good thing.

The problem is that this appearance can be detected by the page showing you the links, cluing the page in to which of the presented pages you’ve been to. The result: not only can you see where you’ve been, but so can the web site!
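In outline, the classic sniffing trick is tiny. The sketch below is an illustration with made-up names: in a real attack, `probe` would insert an `<a href=url>` element into the page and read `getComputedStyle(a).color`; here the browser side is stubbed out so the logic stands alone.

```javascript
// Minimal sketch of history sniffing: a URL is "visited" if probing its
// link yields the visited-link color.
function sniffHistory(urls, probe, visitedColor) {
  return urls.filter(url => probe(url) === visitedColor);
}

// Stub "browser" whose history contains exactly one site:
const fakeHistory = new Set(['https://bank.example/']);
const probe = url => (fakeHistory.has(url) ? 'purple' : 'blue');
const seen = sniffHistory(
  ['https://bank.example/', 'https://shop.example/'], probe, 'purple');
// seen → ['https://bank.example/']
```

The page never needs the user's cooperation: it simply renders candidate links and reads back how the browser styled them.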

Originally specified as a useful feature for the Web, visited link styling has been part of the web for… well, forever. So this is a pretty old problem, and resurfaces every once in a while to generate more paranoid netizens.

The most obvious fix is to disable different styles for visited versus unvisited links, but this would come at the expense of utility: while sites could no longer figure out which links you’ve clicked, neither could you. David Baron has implemented a way to help keep users’ data private while minimizing the effect on the web, and we are deploying it to protect our users. We think this represents the best solution to the problem, and we’ll be delighted if other browsers approach this the same way.

Technical Details

The biggest threats here are the high-bandwidth techniques, or those that extract lots of information from users’ browsers quickly. These are particularly worrisome since they enable not only very focused attacks, but also the widespread brute-force attacks that are, in general, more useful to a variety of attackers (potentially including fingerprinting).

The JavaScript function getComputedStyle() and its related functions are fast and can be used to guess visitedness at hundreds of thousands of links per minute. To make it harder for web sites to figure out where you’ve been without radically changing the web, we’re changing the way we style links in three fairly subtle ways:

Change 1: Layout-Based Attacks
First of all, we’re limiting what types of styling can differentiate visited links from unvisited links. Visited links can now differ only in color: foreground, background, outline, border, and SVG stroke and fill colors. All other style changes leak the visitedness of the link, either by loading a resource or by changing the position or size of the styled content in the document, both of which can be detected and used to identify visited links.
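As an illustration (a sketch, not an exhaustive list), a color-only :visited rule keeps working, while resource-loading or layout-affecting declarations are ignored for :visited:

```css
/* Still honored: color-only differences. */
a:visited { color: purple; background-color: #eee; }

/* No longer honored for :visited: these load a resource or change layout,
   both of which a page can detect. */
a:visited { background-image: url(seen.png); font-size: 110%; }
```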

While we are changing what is allowed in CSS, the CSS 2.1 specification takes into consideration how visited links can be abused:

“UAs may therefore treat all links as unvisited links, or implement other measures to preserve the user’s privacy while rendering visited and unvisited links differently.” [CSS 2 Specification]

Change 2: Some Timing Attacks
Next, we are changing some of the guts of our layout engine to provide a fairly uniform flow of execution, minimizing differences in layout time between visited and unvisited links. With the changes, styles are resolved and stored for every link in both its visited and unvisited states; then, when the link is styled, the appropriate set of styles is chosen, making the code paths for visited and unvisited links essentially the same length. This should eliminate some of the easy-to-mount timing attacks.
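The "resolve both, pick one" idea can be sketched like this. This is an illustration, not Gecko's actual code: `resolveStyle` and the rule shape are made up; the point is that both variants are computed unconditionally, so the work done no longer depends on whether the link is visited.

```javascript
// Stand-in for real style resolution: merge the matching declarations.
function resolveStyle(rules, visited) {
  return Object.assign({}, rules.base,
                       visited ? rules.visited : rules.link);
}

function styleLink(link, history, rules) {
  const unvisitedStyle = resolveStyle(rules, false); // always computed
  const visitedStyle = resolveStyle(rules, true);    // always computed
  // Only the final selection depends on history, not the amount of work done.
  return history.has(link.href) ? visitedStyle : unvisitedStyle;
}
```

With rules like `{ base: { fontSize: '12px' }, link: { color: 'blue' }, visited: { color: 'purple' } }`, styleLink picks the purple set only for URLs present in the history, but both sets are resolved every time.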

Change 3: Computed Style Attacks
JavaScript is not going to have access to the same style data it used to. When a web page tries to get the computed style of a link (or any of its sub-elements), Firefox will give it unvisited style values.
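Conceptually, Change 3 amounts to the following (a sketch with hypothetical names, not Gecko's internals): whatever the link's true state, the style object handed to content callers is built from the unvisited rules.

```javascript
// The style exposed to web content ignores the link's real visited state.
function computedStyleForContent(unvisitedStyle, visitedStyle, isVisited) {
  // isVisited and visitedStyle are intentionally unused for content callers.
  return unvisitedStyle;
}

computedStyleForContent({ color: 'blue' }, { color: 'purple' }, true);
// → { color: 'blue' } : the page cannot tell the link was visited
```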

What does this mean for users?

For the most part, users shouldn’t notice a change in how the web works. A few web sites may look a little different, but visited links will still show up differently colored. A few sites that use more than color to differentiate visited links may look slightly broken at first while they adjust to these changes, but we think it’s the right trade-off to be sure we protect our users’ privacy. This is a troubling and well-understood attack; as much as we hate to break any portion of the web, we need to shut the attack down to the extent we can.

We have to be realistic, though: there are many ways all browsers leak information about you, and fixing CSS history sniffing will not block all of these leaks. But we believe it’s important to stop the scariest, most effective history attacks any way we can since it will be a big win for users’ privacy.

If the remaining attacks worry you, or you can’t wait for us to ship this fix, Firefox 3.5 and newer already allow you to disable all visited styling, which immediately stops this attack: set the layout.css.visited_links_enabled option in about:config to false. While this will plug the history leak, you’ll no longer see any visited styling anywhere.
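For example, the same pref can be set persistently from a user.js file in your Firefox profile directory (the pref name is real; the user.js file is optional and you create it yourself):

```js
// user.js in the Firefox profile directory
user_pref("layout.css.visited_links_enabled", false);
```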

Enhancing Privacy on the Web

We want to bridge the gap between our users’ expectations of privacy and what actually happens on the web. Sometimes users have an expectation that we preserve their privacy a certain way, and if we can, we want to live up to it. Privacy isn’t a feature that can simply be added to a browser, though; it often comes at the expense of utility. We think we’ve found a fix that will balance flexibility for web developers while providing a safer experience for our users on the web.

Sid Stamm, Mozilla Security

67 comments on “Plugging the CSS History Leak”

  1. Colin Dean wrote on

    Using a checkmark image after/before a visited link is a popular thing. I understand the security precaution, though. Will data: URLs still be allowed in order to preserve this great usability enhancement?

    Case in point: profyle.at uses a small blue checkmark after visited links. Our user testing has found that people like this.

  2. Alex Stapleton wrote on

How will these changes affect sibling selectors? You mention sub-elements of links, but you can restyle siblings of visited links too…

  3. Giorgio Maone wrote on

    Congratulations to David Baron and the others involved.
    Very well thought fix 🙂

  4. Matthew wrote on

    It would be helpful to know what other alternatives you considered and threw out. This article, presenting only your final choice, makes me worry that Mozilla is just picking this approach because it’s easier for you to make people’s websites look worse than to find a better solution.

    Please provide users an option in their browser to disable this and revert to the original behavior of allowing visited links to have all the styles they wanted.

For me, I’m going to have to retest all my sites for yet another Mozilla-specific CSS style issue and JS behavior that will retroactively go bad. I have to worry about silly things like where the designer wanted to make his links a point smaller or a new font when they’re clicked.

    1. Sid Stamm wrote on

      @Matthew: Much of the history of this bug is available on bugzilla (Bug 147777). Most of the options considered are debated in the comments of that bug. We did consider many different approaches, but most of what we looked at either made the user’s experience much worse than this fix, or they turned off too many features that developers rely on (outside of just :visited styles).

  5. Wladimir Palant wrote on

    What I am missing out here are the implications for canvas – it can be used to make a “screenshot” of your webpage and read out pixel data, there must be some mechanism preventing visited links from showing up there.

  6. Adam wrote on

    I’m curious what the scale of high bandwidth and quickly are in places like: “The biggest threats here are the high-bandwidth techniques, or those that extract lots of information from users’ browsers quickly.”

    There’s obviously room for disagreement when researchers find things, and I think clarity about the goals would be really helpful.

    Is 10ms per link high-bandwidth? 10 seconds? I’m guessing it’s somewhere in between.

    1. Sid Stamm wrote on

@Adam: I consider hundreds of thousands of links per minute to be high-bandwidth. Many of the timing attacks tend to take a while to obtain a statistically significant result. Those are low-bandwidth, and finish on the order of tens of links per minute.

  7. helenrae wrote on

    You mention positioning the content itself, but what about a background image?

    Will it still be possible to position an already loaded and applied background image via CSS (i.e. an image sprite containing multiple states) ?

  8. thornmaker wrote on

    Will this fix address non-js based history leaking… e.g. http://ha.ckers.org/weird/CSS-history.cgi ?

    1. Sid Stamm wrote on

      @thornmaker: yes, change #1 takes care of that.

  9. Ulrich wrote on

    A solution for Colin Dean’s problem would be to allow custom background images if the visited links point to the same domain.

  10. Tony Mechelynck wrote on

    Does “Change 2” mean that links won’t change colour anymore if I click “Open in new tab” in the context menu of an unvisited link, then come back?

    @Colin: “Change 1” above seems to imply that adding some “after” element only on visited links (which would change the size of the space occupied by the link) would be forbidden. But maybe displaying a child element of the link as √ with “invisible” colours (fg=bg) if unvisited and “visible” (fg≠bg) if visited would be permitted?

  11. Roger wrote on

    I can’t remember the details now, but I remember seeing reference to some API (maybe related to SVG?) that allowed JS to query the color of individual pixels. It is very easy to position links at absolute positions, so the history leak could be exploited through that… or have you guys also fixed it so it returns the color of unvisited links?

  12. David Baron wrote on

    Allowing data: URLs still allows cases where some images would be slower to paint than others, leading to detectable performance differences.

    One possible way (a bit hacky) to make such images still show up is to have a checkmark image that has the same background color as the page, on a transparent background. Then rules like (supposing the page background is white):
    :link:after { content: url(checkmark); background: white }
    :visited:after { content: url(checkmark); background: blue }
    would still work.

    Depending on how much feedback we get about this, we might try to figure out a way to make this possible more easily. (It might be easier to support changing background-image than content.)

  13. David Baron wrote on

    (My previous comment was a response to Colin Dean in comment #1; seems there’s a good bit of caching delaying the comments showing up, so the ordering might be more confusing than I expected.)

  14. Alex wrote on

    I could still use color to steal history data, just less of it.

  15. Damian wrote on

    Colin: Alas this will no longer be allowed as it’s very easy to use that same method to then extract the user’s history.

    However, I don’t see why there can’t be an exception for visited sites of the same website, surely the website can see what the user clicks on anyway within their own site.

  16. Remco wrote on

    I think it would be possible to keep the feature of an image for a style. Basically it goes like this:

* always load any graphics that are used in visited and unvisited anchors.
    * use the same content box for visited, as for unvisited anchors. Then, clip the visited image if it’s too big for the unvisited content box. Pad the content box if it’s too small. This would mean that sites need to make the image a child of the anchor (so not style it with :after, like on profyle.at), or use a background image. So some sites would break. But it would at least keep the possibility.
    * give javascript functions the ‘unvisited version’ of the page.

  17. Steve Krenzel wrote on

    While I like the approach taken here, the usability changes and impact on web developers can be significantly minimized with a minor change to the implementation:

    Only apply these rules to URLs not on the current domain.

    Most pages that legitimately use visited styling do so for links on their own domain. You shouldn’t need to restrict the styling on these links in any way.

  18. David Baron wrote on

@Wladimir Palant (comment #5): the ability to make a screenshot using canvas is available only to chrome and extensions, not to Web pages; it would have significant security and privacy implications beyond this one.

  19. Apphacker wrote on

    Maybe allow for same domain policy so that web developers can still add fancy styling for links within their own domain.

  20. Michael wrote on

As a person with a color vision deficiency, Change 1 will make Firefox completely useless for me. I hope the Chrome people are a bit wiser and don’t follow this nonsense.

  21. Michael wrote on

    Sorry, I’m really angry about that. I think Change 2 and Change 3 are very useful, but I’m completely against Change 1.
    Please be aware that this will affect a significant portion of your user base, see http://en.wikipedia.org/wiki/Color_blindness#Prevalence

  22. Matt wrote on

    “However, I don’t see why there can’t be an exception for visited sites of the same website, surely the website can see what the user clicks on anyway within their own site.”

    I’d like to second this comment. Please provide an exception to sites within the same base domain. This way javascript can do handy things with those links.

  23. Jaanus wrote on

    My question is: what is the point of visited links feature nowadays? Would anyone be worse off (or notice) if this “visited links” feature were simply eliminated from the browser? These days, with rich content and URL shorteners and whatnot, it has lost its usefulness in my eyes. Does anyone have any evidence (scientific, anecdotal, whatever) that people know or care about this feature? (Especially the kind of people featured in “what is a browser” video.)

    How this relates to security: simply eliminating a feature would save everyone the trouble of trying to plug perceived or real security holes related to that feature.

  24. Colby Russell wrote on

    Michael: “completely useless” is surely an exaggeration. Having said that, the Web is certainly just as useless in its current state, no? Web designers have been putting personal taste for aesthetics before the usability for others for a long time, so in truth, you’re already running into problems with low contrast or otherwise indistinguishable color palettes in the wild, right? And even amongst designers who do not fit in those categories, the number of designers who put icons or even non-color types of text decoration in their style sheet to distinguish visited links is by far outnumbered by those who only specify color.

    Using an add-on to help with your color vision deficiency would probably have the highest benefits for you across the board.

David Baron: Is there (or will there be) an API in place (getTrueComputedStyle?) available to chrome? It seems like it would be a shame if add-ons didn’t have an easy way to get at that information; I know the DOM Inspector uses the DOM 2 Style API in question for the computed style viewer. (Certainly no problem if you’re only interested in seeing exactly what is going to be seen by script, but if that’s not your use case, then a problem does arise.) And it would probably be most beneficial to have a separate method rather than a whitelisting effect for chrome, in case your add-on is interested in seeing exactly what script will see.

  25. Mike Samuel wrote on

    JavaScript sandboxing schemes can prevent sandboxed code from doing history sniffing, port sniffing, etc.

    Caja ( http://code.google.com/p/google-caja/ ) addresses history mining by exposing only the non-visited version of styles for links to untrusted code.

  26. Colby Russell wrote on

    Sid, it’s also interesting that you say you “hope other browsers will follow suit”, since the approach makes it seem more like a (belated) temporary stopgap than a permanent solution, and will potentially be subject to future progress which could retain more of the current functionality to be exposed to authors—contingent upon some untangling in /layout that would make such things possible.

    1. Sid Stamm wrote on

      @Colby: I wouldn’t call this fix a temporary stopgap, unless you are hoping this it a stepping stone towards removing browser history entirely, which I doubt would be welcomed with open arms. With the text you quoted, I was saying how I’d like to see more adoption of the fundamental approach: blocking history from arbitrary web sites, but still letting users themselves see it. The point is to match up what the browser does with what users think it does. Browser history is widely perceived as something that’s just part of the browser–not transmitted to sites–and this patch is intended to make it that way.

  27. Anna wrote on

    Please provide an exception to sites within the same base domain or explain why it’s not a good idea.

  28. Frank Yan wrote on

    I don’t think text-decoration (at least underline) affects page layout. If I am correct, why isn’t this supported for :visited?

  29. Ben Curtis wrote on

    Please consider only applying these changes to links that specify a domain.

    These changes look like they will break the (very useful) types of :visited links with checkmarks (e.g., a symbol saying “check — you’ve followed this link”), or otherwise calling deliberate and unmistakable attention to links you have not gone to yet and those you have. For example, a list of PDFs that must be viewed/downloaded could be marked with a.pdf-list:visited { text-decoration:line-through; }; such a notation is not just a design aesthetic, but a significant usability aid that would otherwise require convoluted server-side code to implement.

    So I suggest that these new changes only apply when the link starts with a protocol or double slash, and therefore specifies the domain. I don’t even think it would be necessary to compare the domain to the page’s domain; ANY specified domain will trigger these privacy protections on that link.

    If it doesn’t specify the domain, then the link must point within the domain. For the majority of sites which fully control all pages on their domain, your suggested changes do not provide any additional privacy protection since all of that history can be tracked on the server. But the changes would prevent creative and useful applications of CSS. (Sites where untrusted third parties are hosted under the same domain, e.g., apps.facebook.com, would be the exception — but Facebook controls what JS can run and can easily protect against this themselves.)

    Hope you consider this modification.

  30. James wrote on

    From Webkit’s bugzilla, it looks like Dave Hyatt is going to implement the same fix. (The great thing about a competitive browser market.)

  31. Jesse Ruderman wrote on

    Will user stylesheets be affected?

  32. Eris wrote on

    Hang on, wait a tick. Why not treat link styles using the same-origin policy that affects XMLHttpRequest and Javascript interaction with frames? Visited links that fall under same-origin are styled freely and those that don’t are styled as unvisited.

    Actually, I’d like to see this made part of the CSS spec and add a pseudoclass for all links that point to a resource that belongs to another domain. Don’t suppose it will ever happen, though.

  33. Daniel Veditz wrote on

    @Jesse: yes, user sheets and even chrome sheets are affected. The engine simply doesn’t support :visited styling except for a limited number of properties.

    @Frank Yan: text decoration isn’t supported out of concerns about timing attacks. https://bugzilla.mozilla.org/show_bug.cgi?id=147777#c160
    See also “Test #3” in that bug which only uses the underline property.

  34. Colby Russell wrote on

    @Sid (emphasis is mine):

    I’d like to see more adoption of the fundamental approach: blocking history from arbitrary web sites, but still letting users themselves see it.

    Certainly. And you probably do know that I wasn’t suggesting removing history entirely.

    But your call to follow suit seemed a bit stronger than the adoption of fundamentals; it seemed like one for an adoption of the specific process used in Gecko—the rules it applies to determine the restrictions on :visited and how it will be accessible to content script. Given that many people are seeing this as a loss in functionality, David Baron’s past expressions that indicated he envisioned at least a little more flexibility would remain with regard to what web authors would be allowed to do with the selector, and that the bug is still marked ASSIGNED, as well as my own thoughts that it seems like more options could be preserved, albeit with some more work, it appears to me to be a temporary stopgap set to be improved upon.

    For what it’s worth, this affects me in absolutely no way with regard to Web authoring, and I’m not sore over it. (Did my disdain for designers who put an overemphasis on their personal taste for aesthetics shine through before?) The aspects I’m far more concerned about are ones I share with Jesse: how is this attempt to thwart evildoers on the Web going to affect already-privileged chrome?

  35. Watches wrote on

    I hope users stylesheets aren’t hit by this.

  36. AJ wrote on

    Sid,

Haha, that’s a good one! I almost fell for it because I’m in a different timezone, but when I got to the end I realized it’s an April Fools prank, and a classic one at that! But I don’t think that more experienced Web developers will be fooled — Bug 147777 has been around long enough that any proposal to do something about it is about as likely to happen as having Duke Nukem Forever implemented as a Firefox plug-in. Good try though 🙂

  37. Philip wrote on

I don’t think that you should disable the background-image property for visited links totally. It should be allowed for links that point to the same domain. I know some sites that use background-image for visited links, and it would be a pity if they won’t work as they usually do.

  38. Tom wrote on

    Really a good one. While my first thought was: MUST be an april’s fool, I stumbled across the fact that the date just wasn’t right. But as AJ already mentioned: timezones. Damn globalization! You really got me with this one!

    Good job!

  39. Davin wrote on

No you can’t. You have to have a list of candidate links in the first place, and there are far too many profiles on Facebook.

    Someone (I can’t find the link now) put together a proof of concept using popular Facebook groups. They had a pretty good hit rate on identifying users from their unique combination of groups, I believe it was above 50%.

  40. ant wrote on

    Will this also be used to cripple pages that run no JS code whatsoever?

    1. Sid Stamm wrote on

      History can be sniffed without JavaScript too, just using CSS. See this demo for an example and explanation of how it works.

  41. Dood wrote on

This sounds to me like a hard-to-implement, hard-to-maintain and quite unreliable solution; I am quite confident, though, that it won’t break anything important…
    Still, can somebody please summarise for me the arguments against the SafeHistory approach?
    (I guess you discussed this not only in the Bugzilla entry but also on IRC and etc.)

  42. mogya wrote on

Why don’t you handle such sites as “bad sites” in the Malware Protection?
The sites using “the CSS History Leak” are malicious sites, aren’t they?

  43. Kulmegil wrote on

Soo… will the new hack completely block the possibility of determining visited links? – not only by getting a node and its children’s “computedstyle” but also by getting it indirectly from the PARENT node (by checking its computed height, for example)?

  44. Sai Emrys wrote on

    Could you please fix the link for my results page from my blog repost to http://cssfingerprint.com/results (the original page)?

    Thanks.

    1. Sid Stamm wrote on

      @Sai Emrys: Of course! Thanks for the link.

  45. Sai Emrys wrote on

    @Adam My attack (which Sid kindly linked to) is AFAIK the fastest one currently out there. My current throughput using reliable methods is (local to the browser):
    Chrome: .04 ms/URL, 1,500,000 URL/min
    Explorer: .26 ms/URL, 227,000 URL/min
    Firefox: .10 ms/URL, 553,000 URL/min
    Opera: .09 ms/URL, 640,000 URL/min
    Safari: .02 ms/URL, 3,690,000 URL/min

    IOW, it’s quite a lot faster than the fastest you thought.

There are some other issues that are preventing me from actually doing that much throughput end-to-end; a typical scraping run tests ~80-100k URLs 4x each (on my dev box I’ve gotten up to ~250*4k). But that’s just a temporary hurdle. The severity of this hole is quite significant.

    My code is entirely open source, so if you want to know how the scraping part works, just look at http://github.com/saizai/cssfingerprint/blob/master/public/javascripts/history_scrape.js

    Feel free to visit http://cssfingerprint.com if you’d like to see the effects.

    FWIW, I think that DBaron’s approach is fairly solid. I don’t think that there is anything that can be done short of what he’s doing in terms of breaking expectations of usage, while still fixing the bug. As the post says, it’s an unavoidable trade-off.

    Of course, I’ll also be one of the first to try to break his code, just in case I’m wrong about that. 😉

  46. Otávio wrote on

I agree with “36 – Eris”, and think even better: why not just hide the src attr in the A tag? If it’s from another domain, show about:blank or anything like that.

  47. Dhouwn wrote on

    BTW: What about subpixel positioning, is there a chance that a different link colour might interact with this?

  48. Ferenc Veres wrote on

    What about background-position?

    Can’t we keep that, so using “CSS Sprites” for checkmarks and other visual – also color blind friendly – styles would be possible? One could use that to stroke visited PDF links I think, as requested in a comment above. Does it change anything detectable?

    Now that we know, COLOR will stay, could you set a better visible link color for this blog and another (different) color for visited links? Thanks.

  49. Edward Jones wrote on

    I don’t understand how Michael (comment 23) can say that minus the coloring of links a browser becomes completely useless. While I do sympathize with his condition which I’m sure presents many challenges in life, a web page does not suddenly stop working if you can’t differentiate between visited and non-visited links. I am confident that he would still be able to use any browser even if no difference in the styling of visited vs. non-visited links was presented. I am the first person to stand up for accessibility and design my websites to be usable by those with screen readers for example. I think the absolutism of his comment does a disservice to the accessibility movement.

  50. Adam Messinger wrote on

    For all of those asking about allowing broader :visited link styles for links within the same domain — this will still have privacy implications on sites like LiveJournal and WordPress.com. Many users share the same domain on those sites, and it would be possible for one LJ user (for example) to determine all the other LJs a visitor had viewed.

    Though I understand the need to fix this privacy problem, I’m among those who are less than thrilled at the constraints being imposed on front-end designers and developers. Hopefully a less limiting solution will be found at some point in the future.

  51. Chris wrote on

    Why is this even a privacy issue? It’s not like someone is going to have some gossip site that tells the sites that certain ip addresses visit.

    The only people I can think of that would maybe have a rational reason to be concerned about the web surfing privacy of their ip address is people who are doing something illegal online such as going to websites to get child pornography or something. But why would we want to protect them anyways?

    This is just as stupid as people thinking their privacy is being violated when their DNA is on file after an arrest when they’re found not guilty.

    In both cases there would be no harm done for people that deserve no harm.

  52. Justen wrote on

    I have mixed feelings about this. Privacy is a personal responsibility; giving people the tools to enhance their privacy is one thing, but using hamhanded techniques like this is unlikely to help very much and may cause headaches for designers and developers who legitimately use the features you’re about to axe. If you really want people to have better privacy, just give them better user interface tools to protect it. You could do things like provide a button in the main UI to block the presently visited site from appearing in history, another to block the present site from accessing history information, and another to clear all history. Give them more granular control of what kinds of things get saved to history and for how long via an intuitive control panel.

  53. Daniel Veditz wrote on

    @Chris

    The main demo sites know nothing about “you”, so maybe the fact that they know where “you” visit is merely interesting. But sites that already know more specific things about you (because you have an account with them, for instance) can now correlate all sorts of things with a more specific notion of “you”.

    This does far more than catch people doing illegal things, criminals are not the only ones with “something to hide”. Examples:

    A hacker can use this to figure out which online bank you actually use and present a more believable phishing attack.

    Online stores could show you higher prices if they notice you visit high-end online stores and cheaper prices if you visit walmart.com (Amazon, among others, has at least experimented with showing different prices to different users, though their technique is unknown and probably isn’t using CSS history).

    A military site (where you’re required to authenticate) might find out you’re gay and ruin your career even though you’ve been careful not to “tell”.

    A blog might show only the sharing icons (digg, reddit, facebook, etc) for the services you use rather than a dozen or two confusing little icons (this one might actually be positive).

    Web ads could be better targeted at your demographic (possibly good or creepy)

    criminal groups with websites/discussion groups could use this to “out” undercover cops or informants.

  54. Chris wrote on

    @ Daniel Veditz

    Good points. I feel stupid not thinking about sites that you signed up for.

    The hacker scenario would be a security problem, caused by a privacy issue. A good point.

    The online stores showing higher prices to certain people and the undercover cop issues are also very good points.

    I can think of other negative situations that could arise from this privacy issue now as well.

    I should have thought on it longer. Thanks for setting me straight 🙂

  55. Bruce wrote on

    Like everyone else, I really hope that Mozilla developers and other browser developers will only limit styling on visited external links.

  56. Mitchell Evan wrote on

    Ditto @Matthew. Make the security improvement the default, but allow users to override it by browser configuration. This will go a long way to addressing the accessibility issue.

    Ditto the many requests to make the changes apply only to links to external sites. But we probably want to define external as “untrusted” instead of “other domain”, in order to support a browser’s list of trusted sites e.g. corporate configuration of trusted intranet sites.

  57. bpjonsson wrote on

    I just want to point out that if color is the only means available to differentiate visited links from unvisited ones then the best still possible way to be nice to people with color vision problems is the one which always was most effective:

Use high text/background contrast and let :visited be inverse video relative to :link.

    This is not much used even now in spite of being most effective; no doubt because inverse video is dead ugly and jumps out of the page/screen, and probably not only because we’re unused to it.

    Since I’ve got (non color) vision problems myself I’ve thought a lot about the issue of differentiating (un)visited links from each other clearly without depending on color, and without having to sacrifice the traditional uses of font properties for emphasis. Whatever preferences I had are out now!

    If color is the only remaining way of differentiating links from non-links at all we’re really in trouble…

  58. izdelava strani wrote on

I really hope that you’ll do the job, and even more that the rest will follow, because as a website developer I’m fed up with all the browsers that need to be satisfied so that the web site looks good in all of them.

  59. Pat wrote on

    Pat from AddToAny. We’ve deployed this technique on our sharing widget for years (search for “addtoany smart menu”) so this does indeed affect us and over 100,000 publishers. Personally, I’m okay with plugging this hole to a certain degree, but the aforementioned seems like a silver-bullet approach with too many developer implications.

    AddToAny’s script, for instance, queries against URLs from 200 sharing/bookmarking services to place visited services at the top of the sharing menu. It’s a marvelous use-case, I might add. 😉 200 queries is not “high-bandwidth” as defined above, but it’s noteworthy. Just FYI the results are used on runtime, client-side only.

    My thoughts:

    Have we discussed defining a ceiling for high-bandwidth queries? A maximum number of queries doesn’t cover all privacy implications of this hack, but it would plug the more infamous and nefarious attacks. AddToAny would certainly favor this approach.

    Adopting a same-origin policy on this issue definitely makes sense to me. @Adam Messinger re: your wordpress.com example: That’s why wordpress.com does not (and probably never will) allow arbitrary JavaScript from publishers. Not sure about LiveJournal, but most sites of this nature don’t permit arbitrary JavaScript due to a slew of issues extending beyond this one.

    Regardless of outcome, this surely is an exciting development and we’ll be monitoring the conversation here and at bugzilla. Please ping me if I miss anything or if you’d like to chat directly. Twitter @micropat or pat at addtoany. Cheers!

  60. Bruce wrote on

    Can’t Mozilla simply disable the ability of javascript to determine all the styles on visited links instead of disallowing those styles?

    1. Sid Stamm wrote on

      @Bruce: if we just disabled access via JS, that wouldn’t solve any of the timing attacks or the non-JS CSS-based attacks (those that rearrange the DOM, resize things, or create requests for images). For example, take a look at http://browser-recon.info. The fact that there are so many ways to access the history, with and without JS, makes it necessary to address the capabilities and not just the information presented to JS.