Categories: Security TLS

Deprecating Non-Secure HTTP

Today we are announcing our intent to phase out non-secure HTTP.

There’s pretty broad agreement that HTTPS is the way forward for the web.  In recent months, there have been statements from IETF, IAB (even the other IAB), W3C, and the US Government calling for universal use of encryption by Internet applications, which in the case of the web means HTTPS.

After a robust discussion on our community mailing list, Mozilla is committing to focus new development efforts on the secure web, and start removing capabilities from the non-secure web.  There are two broad elements of this plan:

  1. Setting a date after which all new features will be available only to secure websites
  2. Gradually phasing out access to browser features for non-secure websites, especially features that pose risks to users’ security and privacy.

For the first of these steps, the community will need to agree on a date, and a definition for what features are considered “new”.  For example, one definition of “new” could be “features that cannot be polyfilled”.  That would allow things like CSS and other rendering features to still be used by insecure websites, since the page can draw effects on its own (e.g., using <canvas>).  But it would still restrict qualitatively new features, such as access to new hardware capabilities.

The second element of the plan will need to be driven by trade-offs between security and web compatibility.  Removing features from the non-secure web will likely cause some sites to break.  So we will have to monitor the degree of breakage and balance it with the security benefit.  We’re also already considering softer limitations that can be placed on features when used by non-secure sites.  For example, Firefox already prevents persistent permissions for camera and microphone access when invoked from a non-secure website.  There have also been some proposals to limit the scope of non-secure cookies.

It should be noted that this plan still allows for usage of the “http” URI scheme in legacy content. With HSTS and the upgrade-insecure-requests CSP attribute, the “http” scheme can be automatically translated to “https” by the browser, and thus run securely.
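As a sketch, the two mechanisms mentioned above are enabled with ordinary response headers; the `max-age` value below is just an example (one year in seconds):

```http
Strict-Transport-Security: max-age=31536000; includeSubDomains
Content-Security-Policy: upgrade-insecure-requests
```

With these in place, the browser rewrites legacy “http” subresource and navigation URLs to “https” before any request leaves the machine.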

Since the goal of this effort is to send a message to the web developer community that they need to be secure, our work here will be most effective if coordinated across the web community.  We expect to be making some proposals to the W3C WebAppSec Working Group soon.

Thanks to the many people who participated in the mailing list discussion of this proposal.  Let’s get the web secured!

Richard Barnes, Firefox Security Lead

Update (2015-05-01): Since there are some common threads in the comments, we’ve put together a FAQ document with thoughts on free certificates, self-signed certificates, and more.

288 comments on “Deprecating Non-Secure HTTP”

  1. rott wrote on

    Are we going to be like in the “Idiocracy” movie? All the people in charge are dumb??

    – MS wanted to change the way people look at computers with win8, and they found out it was their way only, and people decided to remain on w7
    – Mozilla wanted to join the “identical interface” (chrome, opera) of a browser with Australis and they lost people (I switched to pale moon, and while I like chrome’s interface I loved the previous firefox too, but not the new one)

    now mozilla moves to cripple sites with their browser, enjoy your drop in users

    ps. MS had the advantage that there is no real OS that can compete with windows, and the choices were to stay with old w7 or move to w8, so there was no real drop in users on “windows”, but in the browser area there are some very good choices

  2. roman wrote on

    Disabling features that used to work is a really bad idea.
    I recommend Firefox for various http intranet sites. Having some features there stop working because of a browser update would be extremely annoying.

    This will also have a chilling effect on small websites. Having to manually get a new cert every year and reading through pages of CA terms of service etc. (I do read them) is not something I’d put up with for every website.

    Even if we get services like letsencrypt or the like, this move is going to make the web a lot less open and decentralised.

    I’d at least expect Firefox to get a lot less hostile toward self-signed certs.

  3. Adrian wrote on

    While I really love the idea of widespread encryption, I believe there are cases where it really doesn’t make much sense.

    For example, I own a website where I do nothing besides showing my portfolio of photos. The only interactive element is a contact form. Encryption doesn’t really add any advantages there.

    1. John Teague wrote on

      The advantage is in the utility of an overall safer web ecosystem, not about one site. Think vaccinations.

      1. Valerio Bozzolan wrote on

        IRL if you don’t want “vaccinations” you are not eliminated by the doctor ._.

  4. Unary Negation Operator wrote on

    When I read that you want to enforce HTTPS, I thought: “Oh the governments would be happy about it”, and next thing I see, you admit to being seduced into this decision by the governments.

    I’m sorry Mozilla, it seems like you forgot your purpose and origins.

    Did you mix the dates maybe? This was supposed to be published on April 1st?

  5. Unary Negation Operator wrote on

    And if anyone has doubts: the next step will be to enforce USERS to have certificates, that uniquely identify them. Because security, wink wink!

  6. QJ wrote on

    I definitely advocate this intent.

    Some comments say “I don’t need HTTPS because my site is non-commercial and doesn’t even have a log-in page”.
    But I should point out that HTTPS brings other benefits besides security.

    For example, if I were a MITM, I would be happy to inject JS scripts to pop up an ad on your HTTP website, or substitute my Baidu ad for your Google ad.
    Also, I could easily redirect visitors from your site to other sites with my aff link, making visitors believe it’s your site’s fault.
    These are very common with ISPs in China. HTTPS mitigates such behaviors.

    1. Zed wrote on

      What if it’s literally a sandbox for experimenting as a developer? (Read: Need access to features) Plus my own blog where I write text and have no ads or images?

    2. brian wrote on

      Yeah, and that’s why I use NoScript. Also, MITM is not the only way to inject javascript into websites.

  7. Toady wrote on

    Interesting idea, but how does this affect things like Google AdSense? The ad code is served over HTTP, which gets blocked on an HTTPS page as mixed content. Does this mean that Google will now add support for their ads to show on secure HTTPS pages?

    If so, how does this fare against the whole concept of deprecating HTTP for security? Is Google phasing out their AdSense?

    How will ads on secure pages fare? Will they become the perfect proxy to steal data entered on secure pages?

    1. Zed wrote on

      Good question!

    2. Daniel Veditz wrote on

      I found a reference to the following help page in a 2011 blog comment; AdSense has apparently supported HTTPS for quite a long time:

  8. Justin C. wrote on

    Reading the comments below, and my answer to nearly everyone is “what a bunch of FUD.”

    I pay MORE for my domain name (~$9 US/year) than I pay for a cert. I can easily get a COMODO single-domain cert for ~$6 US per year, depending on how long the cert lasts. With “Let’s Encrypt” becoming a thing this year, even that argument doesn’t hold water.

    The tin foil hat-wearing people talking about the government revoking your certificate because you’re being critical or whatever (especially the comparison to China) are absurd. The government isn’t involved in your certificates, nor are they involved with the certificate authority (unless you’re using one of the US Department of Defense CAs, in which case… Yeah, you WORK for the government.)

    Get real, people. You’re acting like every website out there that needs to be secure is currently secure, which is obviously not the case. The only way to properly secure the sites that need security is to enforce security through the whole protocol, which is what this is aimed to do.

  9. Nick wrote on

    I understand it has its merits, but it’s not a one-size-fits-all situation. It’s up to the owners of the website(s) to make that decision, not for it to be forced upon people. Does this now mean that if I want to build a website within my own development environment, I need to have HTTPS certificates set up to test all its features before it’s launched?

  10. Fabio Muzzi wrote on

    You are 30 days late. April Fools’ Day was last month. Please be serious.

  11. ziogianni wrote on

    Have you considered that someone can re-compile from the source? So that http can be “reborn”? 🙂

  12. Gilberto Persico wrote on

    Are you nuts? April Fools’ was a month ago. Please focus on fixing things, not on government or multinational requests, if you are still an open source foundation.

    1. Giuseppe wrote on

      I honestly don’t see the point here. I think a malicious javascript can run either with or without a certificate. You can even have a certificate for free, so I reaaaally don’t see the point here.


      1. Anonymous wrote on

        Without HTTPS it is much easier for a man-in-the-middle to inject malicious JavaScript in pages of innocent websites they don’t control.

        1. Erkki Seppälä wrote on

          And what is that malicious JavaScript supposed to do? From the user’s point of view, anything they receive from a maliciously MitMed HTTP site could just as easily be received from a perfectly HTTPS-secured site that simply serves the same content itself.

          It is simply a matter of considering what kind of data one chooses to send to a site, encrypted or not. If you expect confidentiality, you may choose to use encryption.

          Related to this is the China firewall attack on GitHub. While it was unfortunate, I don’t think HTTPS would have prevented it in the end. After all, China can simply require that everyone either uses their HTTPS certs or doesn’t use any HTTPS certs at all.

          1. Christian Parpart wrote on

            Erkki, it is about protecting end-users’ privacy.
            Also, bad JavaScript can fully modify your complete webpage at will. Cheers.

          2. Matthew wrote on

            Maybe it would not have stopped China, which can use state power to coerce certification authorities. We also know that NSA has been able to insert itself into certification authorities as well. However, most attacks aren’t states, they’re mostly teenagers just screwing up sites for the fun of it. Universal HTTPS would be pretty effective against that.

          3. Josh wrote on

            SSL/TLS also provides a way to authenticate a site as being controlled by the people the CA validated. MiTM in a world where everything is SSL requires some trickery, such as squatting on similar misspellings of domain names and using certs issued for those, or using certs from self-signed CAs or other CAs that aren’t usually in the web browser to begin with. All this together makes it a lot harder to MiTM a site that’s using an SSL cert from a reputable CA. It’s not just about encryption.
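The point above, that TLS authenticates the server rather than merely encrypting, shows up in the defaults of TLS client libraries. A minimal sketch with Python’s standard `ssl` module (illustrative only, not code from this discussion):

```python
import ssl

# A default client-side context both encrypts AND authenticates:
# the server's certificate chain must validate against trusted CAs,
# and the certificate must match the hostname being contacted.
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED  # chain must validate
assert ctx.check_hostname                    # name must match

# A MITM presenting a self-signed or mismatched cert fails that
# handshake. Re-opening the hole requires explicitly disabling the
# checks, shown here only for contrast -- never do this in production:
insecure = ssl.create_default_context()
insecure.check_hostname = False              # must be disabled first
insecure.verify_mode = ssl.CERT_NONE         # now any cert is accepted
```

The asymmetry is the point: the secure behavior is the default, and the insecure one takes deliberate opt-out code.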

      2. Robert P wrote on

        It is so freakin easy to MITM connections though. It makes it much more difficult to MITM useful sites when they have HSTS, upgrade CSP attribute, public key pinning, and secure cipher suites.

    2. Benjamin Smith wrote on

      Honestly, rather than actually deprecate http sites, why not pick a *sane* warning system?

      Any site that isn’t encrypted or refers to resources that aren’t encrypted SHOULD HAVE AN ANGRY RED ICON on the address bar.

      Any site that is encrypted but isn’t perfect should have a YELLOW icon because you have some protection. EG: 1024 bit key, stale or self-signed certificate, etc.

      You should see LIGHT GREEN for the “basic” SSL sites.

      You should see DARK GREEN for the “security enhanced” SSL sites.

      The current warning system is lame – it indicates that all is well when you are the *least secure*.

      1. Daniel Veditz wrote on

        In parallel to this effort, or indeed as part of it, we are working on redesigns for the site indicators. As Richard said in the title, “deprecating” insecure HTTP. Part of deprecation is giving people warnings when they’re using a feature you’re trying to phase out.

      2. Cody wrote on

        SSL certificates ARE the warning system in HTTPS…

    3. Ted Kraan wrote on

      I am still looking for the 1st of April joke/reference.

      HTTP was never invented to be secure. Analogies would be ‘Let’s ditch forks because you can’t eat soup with them’ or car analogy ‘Let’s ditch cars because they can’t fly’.

      And the movement is all wrong too. Where is the free, open and transparent web we used to have? In the future the internet will be controlled by a few mega-corporations, which will feed it grey, uncreative, uninspired junk.

      Why not attack the root of the evil? The JavaScript/ActiveX/ActionScript engines that let a hacker on the other side of the world install malware because you navigated to the wrong site. Why was that never addressed? More curiously, who invented that stuff? Push installations from a remote host through a browser? What were they smoking when they thought of that?

  13. Lestat wrote on

    What about hobby projects which only offer static HTML pages? This is outright discrimination against small web projects which either see no reason to move towards HTTPS or have neither the time nor the necessary knowledge.

    What you propose is discrimination against simple webpages which pose no danger. You are losing touch with reason more and more.

    Switching to another browser now, good bye!

    1. Brian wrote on

      All of the service tiers that Microsoft Azure offers for websites, including the free tier, have had an HTTPS endpoint for years. It’s transparent to the individual website and requires zero effort or change on the part of the web developer; you get a Microsoft certificate. The only work you need to do is if you decide to have your own domain name.

      If Microsoft can do it, every hosting service can too.

      1. Kirrus wrote on

        Microsoft have a CA cert. They can automatically sign certificates. Mom & Pop hosting don’t, and can’t.

        This is not a good thing for small hosts, nor small webmasters.

        1. Cody wrote on

          Hi Kirrus,
          You’re wrong. Mom and Pop can generate SSL certificates for free with StartSSL after a $50 verification… If they can’t afford $50, they probably shouldn’t be running a business.

          1. Cobab wrote on

            When I was younger, there were people on the internet who had webpages without running any business. That’s almost over, and I see Mozilla wants to finish the job and kick us all out.

        2. Ron E. wrote on

          Cloudflare also allows free SSL certs. Learning to properly set up SSL takes maybe a few hours if doing it manually.
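To give a sense of the setup effort being discussed, here is a minimal sketch of a TLS server block for nginx. The domain and certificate paths are placeholders, and a hardened deployment would also pin down protocol versions and cipher suites:

```nginx
# Minimal HTTPS server block (domain and paths are placeholders)
server {
    listen 443 ssl;
    server_name example.com;

    ssl_certificate     /etc/ssl/certs/example.com.pem;
    ssl_certificate_key /etc/ssl/private/example.com.key;
}

# Redirect plain HTTP to HTTPS
server {
    listen 80;
    server_name example.com;
    return 301 https://$host$request_uri;
}
```

Obtaining the certificate itself is the other half of the work, which is the part services like Let’s Encrypt aim to automate.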

      2. Zor wrote on

        This is not the answer I expect from Mozilla. Thanks for pushing the Big Corp “Cloud” Providers’ agenda and screwing over small providers. Nice way to defend the “open” web.

        1. Daniel Veditz wrote on

          We are pushing for secure communication on many fronts. Richard is describing eventual plans for the browser. At the same time, Mozilla is supporting the nascent LetsEncrypt effort to provide easy-to-manage free certificates to domain owners. All the parts have to work hand-in-hand before we go from the “intent” described in this post to shipping.

    2. kobaltz wrote on

      Use CloudFlare with their Free Tier. You get free SSL support without having to do anything. As far as the browsers are concerned, it is served over SSL. The connection between CloudFlare and your Web Server would not be encrypted though.

      1. LvH wrote on

        And CloudFlare can then monitor all the traffic. 😉
        Any site using SSL behind CF, including those that chose Full-SSL and thus provided their private key, can have its data decrypted by CF at any time, since CF sits in the middle as a constant MITM. So how exactly is that more secure than no SSL at all? OK, it gets harder to sniff packets on the local network. But for really sensitive data, it serves no point.

        Just as much that SSL actually serves no point at all for static HTML websites. Why on earth would you want to encrypt that? Complete waste of resources.

        1. Daniel Veditz wrote on

          You can’t have it both ways! Yes, CF’s free tier is inappropriate for “really sensitive data”, but so is a completely unencrypted connection, which was the subject of this article. For anything short of that type of data–for which you need a real certificate regardless of Mozilla’s plans to deprecate http–“harder to sniff packets on the local network” is a big win.

        2. Pavel wrote on

          There are lots of wifi hotspots that inject or replace ads in every HTML page requested by clients. And there are ISPs that sniff and sell user traffic data (URLs visited and cookies set by advertisers). And malware, of course. This is not a theoretical threat, nor one from a spy movie; automated HTTP interception is a lucrative business nowadays.

    3. James Patrick wrote on

      The Lets Encrypt project should take care of both of these aspects.

  14. Patrick Lambert wrote on

    What about the fact that SSL requires a unique IP per site, or the ability to support SNI, which requires a specific set of OSes? For example, Windows Server 2008 R2 cannot use SNI, and that is (and is likely to remain for several years) the most popular enterprise server in the world.

    1. Ed Burnett wrote on

      The vast majority of the web runs on Linux, BSD and variants. If a company makes the business decision to buy into proprietary software, then they should factor in all of the licensing, obsolescence, lock-in, and long upgrade cycle issues that go along with it.

      1. hron84 wrote on

        Yep, but you’re talking about new servers. What about old, already existing servers? If anyone buys a new server it will be shipped with a new OS, but upgrading an existing server is a whole different thing, and it could depend on the existing infrastructure too. I think Mozilla should not make infrastructure decisions for unknown companies. If there’s an old Win2008 server, then there’s a reason for its existence.

        From another angle: as long as Microsoft supports the existence of Windows 2008 R2, Mozilla should support it too, regardless of personal preferences.

        1. Ed Burnett wrote on

          We’ve reached a point on the political world stage where unencrypted traffic is a danger to human rights. I believe Mozilla is implicitly acknowledging that this is a higher priority than legacy systems in use by some institutions that may present some difficulty and expense in replacing.

          1. LvH wrote on

            How is it a danger to human rights when it concerns static public HTML? Seriously?
            From the packet headers you can still see what site and page is being visited, and no SSL encryption in the world is going to change that… You’d have to tunnel the traffic first to achieve any gain there.

            Surely, SSL is better for sensitive data. Absolutely.
            But it serves no purpose to force it everywhere… Not a single purpose at all.

            1. User wrote on

              That’s not how SSL/TLS works.

            2. Ed Burnett wrote on

              Actually, HTTPS packets do not reveal the requested URL/page. Only the server address.

              The benefits to privacy and security for all sites are many, and the technical impact is negligible. The NSA has well documented their interest in people who merely read certain technical articles. Not to mention that by making a TLS connection you are assured that the data you are receiving is authentic, has been delivered by the verified owner of the domain, and has not been manipulated in transit.

            3. Gabriel Corona wrote on

              > Actually, HTTPS packets do not reveal the
              > requested URL/page.
              > Only the server address.

              And the server (vhost) name thanks to SNI.

              And the protocol thanks to ALPN.

  15. lozl wrote on

    1) then you should have ON/OFF big button for compatibility toggle
    2) make a way to allow self-signed certificates with ONE CLICK, not the current clicking nonsense
    3) @startssl PR of israel-based company: sftu

  16. Sven Slootweg wrote on

    My response to this, given that it’s a little too big to fit into a comment:

    1. Dianne Skoll wrote on

      Sven Slootweg’s article is excellent. This is not a good idea and doesn’t buy any actual security as long as browsers trust a whole bunch of CAs. It just takes one crooked or compromised CA to crash the whole house of cards.

    2. Daniel Veditz wrote on

      You’re right that a browser deprecating insecure HTTP isn’t sufficient to create a secure internet, but Mozilla (and others) are pushing on many fronts. Mozilla is tightening up requirements on our CA program, for example, and while there’s still room for improvement we have eliminated cryptographically weak root certificates, required stronger signatures (killing MD5 hashes and on track to kill SHA1; increasing the minimum key size), tightened the audit requirements on CAs (which has resulted in the de-listing of at least one CA), and so on.

      You’re wrong that certificate pinning is “still not actually implemented by major browsers.” It’s implemented in Firefox and Chrome which represents about half of browser usage. Users have a choice to use a more secure browser, and even if they don’t switch Firefox and Chrome users act as canaries detecting bad certs on popular sites and ultimately protecting users of all browsers (against general attacks; obviously a carefully targeted attack could bypass this).

  17. pyalot wrote on

    Are you fucking insane?

  18. Chris Star wrote on

    For a lot of static websites https literally does nothing except burden the webmaster, as a third party can still see which websites are visited.

    You shouldn’t punish people like a local bakery for publishing a simple website where they show off their bread, or a local archeologist who shows his findings online, for the greater good. The internet should welcome tech-illiterate newcomers to share information online.

  19. rtechie wrote on

    As someone who does a lot of work with PKI, I think this is an extremely bad idea.

    Making HTTPS mandatory will seriously degrade the security of existing web sites.

    Right now, the main problems with SSL/TLS have to do with bad actions by root Certificate Authorities (like China’s CA) issuing inappropriate or questionable certificates.

    You’re assuming that site operators, and more importantly users, are going to use HTTPS intelligently and appropriately and that’s a bad assumption.

    Forcing every single site to use HTTPS means that unless a site has a cert chaining to a trusted root CA, users will get a browser error. And we’ve “trained” users to avoid sites with browser errors. This will create a “gold rush” at the root CAs as lots of smaller sites start requesting certs. This will inevitably lead to more bad certs being issued.

    And there will be a LOT more questionable certs issued.

    Because you intend to block features behind HTTPS, you’re making it impossible to TEST using HTTP, so every single internal, QA, or test site needs a cert. Sure, they can use self-signed, but users will get a browser error. So now either that organization has to run their own CA or get more certs from the root CAs, which is a lot easier. That’s going to be a flood of cert requests on the CAs.

    I really need to stress what a problem it is that you’re requiring certs for all internal web sites.

    And what about intranet sites in general? Have you guys developed a better method, of any kind, for distributing enterprise root certs around? Right now, I have to manually install them on every PC. Now you’re saying I have to do that no matter what.

    The short version is that the core problem with HTTPS right now is that it’s too popular. Making HTTPS mandatory will further degrade its utility and put serious and important uses of HTTPS, like financial transactions, in danger.

  20. Lestat wrote on

    For everyone interested in using a more reasonable browser…
    – Closed source, but these devs do care what the user wants and needs and implement it – aka adding customization and options instead of removing them
    – Gecko without Mozilla’s crazy ideas and Chrome hunt-down ideas
