Deprecating Non-Secure HTTP

Today we are announcing our intent to phase out non-secure HTTP.

There’s pretty broad agreement that HTTPS is the way forward for the web.  In recent months, there have been statements from the IETF, the IAB (even the other IAB), the W3C, and the US Government calling for universal use of encryption by Internet applications, which in the case of the web means HTTPS.

After a robust discussion on our community mailing list, Mozilla is committing to focus new development efforts on the secure web, and to start removing capabilities from the non-secure web.  There are two broad elements of this plan:

  1. Setting a date after which all new features will be available only to secure websites.
  2. Gradually phasing out access to browser features for non-secure websites, especially features that pose risks to users’ security and privacy.

For the first of these steps, the community will need to agree on a date, and a definition for what features are considered “new”.  For example, one definition of “new” could be “features that cannot be polyfilled”.  That would allow things like CSS and other rendering features to still be used by non-secure websites, since the page can draw effects on its own (e.g., using <canvas>).  But it would still restrict qualitatively new features, such as access to new hardware capabilities.

The second element of the plan will need to be driven by trade-offs between security and web compatibility.  Removing features from the non-secure web will likely cause some sites to break.  So we will have to monitor the degree of breakage and balance it with the security benefit.  We’re also considering softer limitations that can be placed on features when used by non-secure sites.  For example, Firefox already prevents persistent permissions for camera and microphone access when invoked from a non-secure website.  There have also been some proposals to limit the scope of non-secure cookies.
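
To give a concrete flavor of that last idea: the existing “Secure” cookie attribute already provides one such scope limit, since a Secure cookie is never sent over a non-secure connection.  A minimal Python sketch (the cookie name and value here are made up for illustration):

    from http.cookies import SimpleCookie

    # Hypothetical session cookie, scoped so it never travels over plain http.
    cookie = SimpleCookie()
    cookie["session"] = "0123456789abcdef"  # made-up token, for illustration
    cookie["session"]["secure"] = True      # withhold from non-secure requests
    cookie["session"]["httponly"] = True    # also hide it from page scripts
    print(cookie.output())
    # -> Set-Cookie: session=0123456789abcdef; HttpOnly; Secure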

It should be noted that this plan still allows for usage of the “http” URI scheme in legacy content. With HSTS and the upgrade-insecure-requests CSP directive, the “http” scheme can be automatically translated to “https” by the browser, and thus run securely.
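
To make that concrete, here is a minimal sketch, assuming a Python/WSGI server (the port and page body are made up for illustration), of a response that sends both signals:

    # Sketch only: a tiny WSGI app emitting the two headers discussed above.
    # HSTS tells the browser to use https for this host on future visits
    # (browsers only honor it when it arrives over https); the CSP directive
    # asks the browser to rewrite http:// subresource URLs on this page.
    def app(environ, start_response):
        headers = [
            ("Content-Type", "text/html; charset=utf-8"),
            ("Strict-Transport-Security", "max-age=15552000"),  # ~180 days
            ("Content-Security-Policy", "upgrade-insecure-requests"),
        ]
        start_response("200 OK", headers)
        return [b"<p>Legacy http URLs are upgraded by the browser.</p>"]

    if __name__ == "__main__":
        from wsgiref.simple_server import make_server
        make_server("", 8000, app).serve_forever()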

Since the goal of this effort is to send a message to the web developer community that they need to be secure, our work here will be most effective if coordinated across the web community.  We expect to be making some proposals to the W3C WebAppSec Working Group soon.

Thanks to the many people who participated in the mailing list discussion of this proposal.  Let’s get the web secured!

Richard Barnes, Firefox Security Lead

Update (2015-05-01): Since there are some common threads in the comments, we’ve put together a FAQ document with thoughts on free certificates, self-signed certificates, and more.

288 responses

  1. FF Extension Guru wrote:

    I run around half a dozen sites. None of them are eCommerce. They are simply blog and gallery sites. The only login is for admin purposes. I don’t self-host these; they are on a shared server. I don’t get revenue from these sites, so the way they are hosted now works well for me, as this is a hobby, NOT a business. I suppose if this were to go forward, the only functionality that would “break” would be the admin login. In which case I would just have to use another browser for that portion. Of course, if this would prevent users from submitting comments, then I would have a big problem. Trying to convert to secure seems like overkill and would be costly and time-consuming.

    I do run one other site which is an eCommerce site, and they are on a different server with an SSL certificate (which they really don’t technically need, since all the payments are processed by PayPal, but we wanted their customers to feel secure, and the SSL does encrypt the transmission of the order contents, which includes their non-financial personal information).

  2. SteveP wrote:

    It would be even better if this didn’t mean/imply forcing everyone running a site to pay for a cert. How about a free, quality cert provider to make it painless?

  3. Oscar wrote:

    Cool idea. But it will be complicated for infrastructure providers who offer video streaming services. Not all streams need to be encrypted, and broadcast streams are often transmitted with another layer of encryption inside HTTP, using HTTP only as the transport protocol. This move will have an impact on these kinds of content providers. Have you considered that?

  4. Keith Curtis wrote:

    Encryption is valuable, but it isn’t necessary or interesting for a large number of sites. Take mine, for example. I have some free blog postings, some music, a link to my book, etc.

    It is all information I give away for free. Why would I encrypt it?

  5. none wrote:

    Fab news, with a sizable Godzilla in the corner..

    In a world where https is as affordable for all as http, we would probably have https as the rule by now anyway.

    However, in the exploitative environment the web has developed into, hosting providers treat https either as an extra feature to be added for a fee, or already charge more for services which include “free” or “default” https.

    By forcing rather than negotiating the move to https, Mozilla and Firefox are not helping some of the more financially challenged developers.

    ..and that is the opposite of Fab. What is that called?

    Cheers!

  6. Roy wrote:

    Congratulations on sending a message to the web developer community that they need to feel warm happy feelings. The Internet has SSL on *both* of its web sites (Gmail and Facebook). There aren’t any other use cases for a web browser, are there? I’m glad that SSL is there to protect me against the army of evil hackers sniffing my ISP’s core routers.

  7. Evan wrote:

    I’m glad of the decision, but what’s the story on browser support for DANE (that is, TLS certificates in the DNS, signed and validated using DNSSEC)?

    DNSSEC is easy nowadays (as long as you picked the right registrar; I voted with my feet on that point). I really have no desire to pay a CA, though. At the moment, I run my personal site over both http and https, using a self-signed cert; people can plow through the scary warning page and add an exception if they really want their traffic encrypted. But if I could pop a TLSA record into my zone and give people a green lock for free, I would be so very happy about that.

    1. tjeb wrote:

      Yes please. TLSA still gives one the option to use a CA, but also allows the use of self-signed certificates. In addition, it can close the leap of faith that is still there with HSTS, *and* it signals to the browser that https can be used before any http request is made (a request which could otherwise be hijacked).
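
      For anyone curious, here is a minimal sketch (the domain and the server.pem path are made up) of deriving the payload of a “3 0 1” TLSA record, i.e. DANE-EE usage, full certificate, SHA-256 matching, per RFC 6698:

          import hashlib
          import ssl

          # Sketch only: hash a (possibly self-signed) certificate for a
          # "3 0 1" TLSA record (DANE-EE, full cert, SHA-256), per RFC 6698.
          pem = open("server.pem").read()      # assumed path to the site's cert
          der = ssl.PEM_cert_to_DER_cert(pem)  # PEM -> DER bytes
          digest = hashlib.sha256(der).hexdigest()

          # The record one would publish for https (TCP port 443):
          print(f"_443._tcp.example.com. IN TLSA 3 0 1 {digest}")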

  8. Matthew Kercher wrote:

    When did the open source community become so focused on restricting users instead of enabling freedom? I understand the need for security, but this is vastly overkill. I run my school radio station’s website. Should I encrypt that? For what purpose? So some scary man in the middle can’t see that someone really has school spirit? This is stupid and needless and will only serve to A) Make the internet less readily accessible to everyone and B) Reduce Firefox’s relevancy even further. When I’m testing PHP with Apache on localhost, should I have an SSL Cert for that? When I stop being lazy and finally slap up a resume website, should I pay for a cert for that? When developing software it’s important to value all use cases, and in this respect Mozilla has put forth very little thought.

    1. Andrés Rodríguez wrote:

      > Make the internet less readily accessible to everyone

      I’m sure Mozilla is going forward with this because of Let’s Encrypt.

      > When I’m testing PHP with Apache on localhost, should I have an SSL Cert for that?

      Localhost could be an exception by default, and on FF Developer Edition forced SSL might be disabled by default. It could be a single flag in about:config for all we know.

      1. Mike Simon wrote:

        I looked at Let’s Encrypt. It seems really simple if you run Linux, but what about Windows servers? Does it support wildcard domains? I guess it being free kind of obviates the need for multi-domain certs, so that’s cool, but if it doesn’t do wildcards or work on Windows servers, some people will be left out.

  9. bosse wrote:

    Hopefully this will be a fast transition, and not something that takes 2+ years. My second hope would be for Google/Bing/DuckDuckGo etc. to significantly reduce the ranking of http sites and, after a transition period, to stop showing unencrypted websites with default settings. Weak SSL ciphers/options should also be removed.

  10. Gustaaf wrote:

    I think it’s way too early to make a move like this. A secure web is only possible when IPv6 is globally adopted. Worldwide only a few dozen ISPs support IPv6. There are simply not enough unique IP addresses for all sites to run HTTPS.

    You will (in the near future) stop introducing new features for HTTP and even disallow ‘insecure’ features. So no more JavaScript for HTTP sites, for example? I can hardly think of anything that will benefit from HTTPS apart from peer verification.

    A whole lot of people have hobby sites, myself included, that access a lot of state-of-the-art but possibly insecure features. Do we need an SSL certificate too? Bullish. It’s not only the cost of the certificate itself, but also the separate hosting that will add to the CoO.

    Before moving in this direction I strongly recommend waiting for:
    1. A better alternative to HTTPS, which is too slow and also insecure (according to Snowden, the NSA can already crack most HTTPS connections).
    2. Worldwide IPv6 adoption, not expected in the next 10 years. Although it should’ve happened 10 years ago, still only a single-digit percentage[a] of all internet users can access IPv6 sites natively. There are simply not enough IPv4 addresses right now for every single website to have its own IPv4 address.

    a: https://www.google.com/intl/en/ipv6/statistics.html

  11. mk wrote:

    This is bad news.

  12. Guest wrote:

    Firefox still can’t handle self-signed certificates or certificates from cacert.org properly.

  13. Guest wrote:

    I hope you at least allow people to switch http back on in about:config. You could have a new setting like “protocol.http = false” by default, but allow people to switch it back on manually if they want.

  14. Guest wrote:

    Everyone prepare to see this message a lot for HTTP sites too LOL >> http://i.imgur.com/CPuchkJ.jpg

  15. G wrote:

    Well there goes caching…

  16. Guests wrote:

    Please read “Please consider the impacts of banning HTTP” at https://github.com/GSA/https/issues/107

  17. Lars Viklund wrote:

    I hope that this initiative involves someone reworking the extremely broken and WONTFIX client certificate selection dialogue box.

    It’s bad enough for the few work infrastructure sites that require it. If I have to nag-OK every single site I go to, Firefox is dead.

  18. Nick wrote:

    I’m completely against the idea of forcing people to use HTTPS for the latest features. Not all websites need to be secure. Sites that collect personal information, or that require logins and passwords, should be secure, I accept. But if a site is purely for advertising and nothing more, what exactly is the purpose of this? Maybe Mozilla should find out how much support they have for this suggestion before trying to force everyone down this route.

    1. Zed wrote:

      NO, man, they’re proud to announce it’s their newest decision, and they’re sticking to it whether anyone else wants it or not, because the government wants it!

  19. Shelikhoo wrote:

    The restrictions might be:
    unless the website is serving from the local network,
    1. you can’t use a password input,
    2. you can’t set cookies,
    3. JavaScript is disabled.

  20. Mike Sirell wrote:

    Please, PLEASE stop using the word “secure” when you mean “HTTPS”.

    Firstly, “secure” is not a binary state.

    Secondly, the idea that HTTPS web traffic cannot be intercepted, decrypted, modified, blocked, etc. was a highly dubious claim ten years ago, and in 2015 it’s ludicrously, laughably wrong. To try to claim otherwise is extremely dangerous.
