Categories: Security

Deprecating Non-Secure HTTP

Today we are announcing our intent to phase out non-secure HTTP.

There’s pretty broad agreement that HTTPS is the way forward for the web.  In recent months, there have been statements from IETF, IAB (even the other IAB), W3C, and the US Government calling for universal use of encryption by Internet applications, which in the case of the web means HTTPS.

After a robust discussion on our community mailing list, Mozilla is committing to focus new development efforts on the secure web, and start removing capabilities from the non-secure web.  There are two broad elements of this plan:

  1. Setting a date after which all new features will be available only to secure websites
  2. Gradually phasing out access to browser features for non-secure websites, especially features that pose risks to users’ security and privacy.

For the first of these steps, the community will need to agree on a date, and a definition for what features are considered “new”.  For example, one definition of “new” could be “features that cannot be polyfilled”.  That would allow things like CSS and other rendering features to still be used by insecure websites, since the page can draw effects on its own (e.g., using <canvas>).  But it would still restrict qualitatively new features, such as access to new hardware capabilities.

The second element of the plan will need to be driven by trade-offs between security and web compatibility.  Removing features from the non-secure web will likely cause some sites to break.  So we will have to monitor the degree of breakage and balance it with the security benefit.  We’re also already considering softer limitations that can be placed on features when used by non-secure sites.  For example, Firefox already prevents persistent permissions for camera and microphone access when invoked from a non-secure website.  There have also been some proposals to limit the scope of non-secure cookies.

It should be noted that this plan still allows for usage of the “http” URI scheme in legacy content. With HSTS and the upgrade-insecure-requests CSP directive, the “http” scheme can be automatically translated to “https” by the browser, and thus run securely.
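For concreteness, here is a sketch of what that opt-in looks like on the wire (the max-age value is an arbitrary example, not a recommendation). A site serves an HSTS header so the browser rewrites future “http” URLs for that host to “https”, and an upgrade-insecure-requests policy so non-secure subresource links inside legacy pages are fetched over HTTPS:

```
Strict-Transport-Security: max-age=31536000; includeSubDomains
Content-Security-Policy: upgrade-insecure-requests
```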

Since the goal of this effort is to send a message to the web developer community that they need to be secure, our work here will be most effective if coordinated across the web community.  We expect to be making some proposals to the W3C WebAppSec Working Group soon.

Thanks to the many people who participated in the mailing list discussion of this proposal.  Let’s get the web secured!

Richard Barnes, Firefox Security Lead

Update (2015-05-01): Since there are some common threads in the comments, we’ve put together a FAQ document with thoughts on free certificates, self-signed certificates, and more.

288 comments on “Deprecating Non-Secure HTTP”

  1. david wrote on

    awesome news 🙂 I don’t expect this to take place over the course of a year, but it’s a good start. And some people thought that the discussions about getting rid of http were a joke!

  2. Hamish wrote on

    > Gradually phasing out access to browser features for non-secure websites, especially features that pose risks to users’ security and privacy.

    Why “especially”? What is the motivation behind phasing out access to browser features that do not pose risks to security and privacy?

    1. rbarnes wrote on

      If you look closely enough, it can be hard to find features that don’t have security and privacy risks. Nobody thought <canvas> was a privacy risk until people demonstrated canvas fingerprinting. It’s a question of degree.

  3. Peter wrote on

    Does this mean I can no longer launch a web site without paying for a cert? Or somewhat anonymously? Or likewise?

    I wouldn’t mind if http=self-signed cert, https=CA-signed cert, or similar.

    As is, half the kids in my dorm when I was an undergrad ran their own web servers, and maybe 5% of those turned into web startups.

    Making the web even more asymmetric seems like a Very Bad Idea.

    Good news is I suspect this might be a bit like IPV6 and never actually happen.

    1. rbarnes wrote on

      Nothing about this plan prevents you from using non-secure HTTP. It just means that over time, secure HTTPS is going to get more awesome, while non-secure HTTP is going to get less awesome. If the less-awesome web is good enough for you, you can keep on using non-secure HTTP. Though obviously the web would be better if you didn’t.

      1. Matthew wrote on

        You didn’t address the meat of his question. SSL certs are very expensive. This will be a factor in limiting speech on the web. The most frustrating thing about this https-only push is that the advocates absolutely ignore that the web was built on people having servers in their closets.

        Shouldn’t we fix the SSL cert problem until they are as cheap as domains first?

        1. J.R. wrote on

          “SSL certs are very expensive”

          The price of SSL certs falls somewhere between very cheap and free.

          “This will be a factor in limiting speech on the web”

          What absolute nonsense. The vast majority of people online “speak” through third party services. A single certificate can enable the speech of millions of users. A few dollars for a certificate isn’t enough to hamper competition.

          1. Dave wrote on

            A basic SSL cert from a decent CA costs anywhere from $25 to $50 per year.
            Multi-subdomain / wildcard SSL certs cost $300 or more per year.
            People outside the USA and EU number in the billions.
            For most of them, USD costs are high, and since CA chains most commonly originate from American root CAs, the high costs get passed down to third-world users.

            Lots of kids purchase $20/year VPSes and start their first proper web presence. This entire layer of users will be screwed unless self-signed certs are given more weight or the cost of SSL certs is brought down drastically. Supply and demand does not apply here because the supply is restricted artificially.

            A CA-derived SSL cert might look good in a browser but says literally nothing about the business. No real verification beyond email and credit card number happens, so it’s only MITM that seems to be prevented / protected against by the CA tree. There needs to be a regulated price reduction, OR an OpenID-style verification (it’s the same level as a CA) during CSR processing, OR one good corporation that disrupts the CA extortion business model – like if Google or Mozilla were to start as a CA selling certs at $5 per year for email + CC verification.

            Or at least, in your replies henceforth on this topic, provide links to cheap SSL cert providers.
            Please help solve the problem for everyone, “works on my pc” doesn’t work on the internet.

            1. Zed wrote on

              Nuh uh man, the websites will just be “less awesome” for those kids. What a joke, Richard.

              Bottom line is that https isn’t as easily accessible to the general public (WHO F***ING BUILT THE WEB AND MADE IT WHAT IT IS) and this is going to limit our presence. I’m a web developer. I have 2 websites. One is a gallery of my own pictures and one is a personal blog. What about either of those things needs to be forced to be secure and should cost me an extra shit-ton (relative to my $10/year/domain registration and $10/year hosting)?

            2. Aranjedeath wrote on

              One need only check StartCom for a free certificate. Soon we’ll have Let’s Encrypt, as well.

          2. foljs wrote on

            The price of SSL certs falls somewhere between very cheap and free.

            Are you an American/Western European by any chance? If so, shut up, stop posting misleading BS, do some research and then talk.

            1. Gabriel wrote on

              Self-signed is free, though less awesome; there are other free options today, and Let’s Encrypt is just around the corner. This move is going to push demand for other free or trivially expensive options over time as well. Certificates fall somewhere between very cheap and free.

              Do some research!

        2. HybridAU wrote on

          SSL Certs have not been expensive for some time now.

          StartSSL offers free Class 1 certs and has done so for a few years now; they are trusted by all browsers and good enough for 99% of sites. Then if you want Class 2 (for organizations rather than personal) or more SANs or wild cards, unlimited Class 2 certificates can be had for < $70.

          Then there is "Let’s Encrypt"; we have yet to see how that plays out, but it looks like it will make it very easy to get a free cert.

          1. Anon wrote on

            StartSSL is Israeli-based. No thanks! I’d prefer to be a few dollars rather than trusting them.

            1. Anon wrote on

              be -> pay

            2. Ben Hutchings wrote on

              Unless you use certificate pinning, your trust in the CA you choose for your web site is irrelevant. Any widely trusted CA, including StartSSL, GoDaddy, or the Dutch government, could issue a fake certificate for it.

            3. Ninveh wrote on

              Why would you not trust an Israeli CA? From a political point of view?
              If from a trust POV, keep in mind that Israeli cyber operations, as widely reported, are geared only against local Middle East adversaries and against entities who might have info relevant to their ME adversaries. This is in contrast to the US, and its 5-eyes proxies, who consider the whole world as their adversaries.

          2. Dave wrote on

            > Then if you want Class 2 (for organizations rather than personal) or more SANs or wild cards, unlimited Class 2 certificates can be had for < $70.

            Links, please …?


            1. Adrian wrote on


        3. alex wrote on

          “Shouldn’t we fix the SSL cert problem until they are as cheap as domains first?”
          Establishing a CA that provides free certificates (like “Let’s Encrypt”) probably takes way less time than deprecating non-secure HTTP. So we should do both in parallel.

      2. Frank wrote on

        That is complete and utter b*ll and you know it. Nobody in his right mind would insist on using a secure site for a simple web presence that doesn’t present anything more than some info. While certificates may not be expensive anymore, secure HTTP still requires a fixed IP address. Guess what often-used shared hosting services don’t provide (unless at considerable extra cost). And don’t start about IPv6. IPv4 addresses are going to be necessary for a long time to come.

        This decision sucks.

        1. Graham wrote on

          @Frank: Almost all browsers (IE on Windows XP & old Android are the main exceptions) support SNI now so servers don’t need one IP per cert.

        2. Gerry Mander wrote on

          Amen brother. While we’re at it, I’d love for someone to come along and build a browser that only supports HTML and CSS. The web was much better before AJAX. So much of client-side scripting is unnecessary and only makes it easier to spy on users while degrading the experience.

          1. James wrote on

            Why muck it all up with CSS?

        3. Owen wrote on

          Exactly, there are millions of sites that don’t have any need for https, what a typical lazy one-size-fits-all response to a problem. The legacy web of distributing information to users without logins and web “apps” is what started the whole www thing in the first place and continues to be important. Wake up Mozilla, you’re losing the plot.

          1. S. Albano wrote on

            If my cable company can inject JavaScript notifications into any unencrypted site (thanks Cable Company…), couldn’t a malicious script kiddie at the coffee shop inject an iframe to an attack site into your site while they were MITM attacking your user?

            Iframes, JavaScript, plugins can and are being injected into our unencrypted traffic for bad purposes. This is about data integrity and user security, not just privacy.

        4. kirb wrote on

          If you read the article, a simple web presence like you describe wouldn’t be affected. All it affects is some current issues regarding potentially private data (such as camera/microphone access) when sent unencrypted and limiting future features that are similarly problematic if unencrypted. Definitely not a one-size-fits-all; that would be too stupid and Mozilla would wake up the next day to find nobody uses their product any more. Any simple, informational, website probably wouldn’t even need JavaScript at all nowadays; if it does it surely wouldn’t need such features.

      3. Peter wrote on

        I spent a while in China. The Internet was awesome, especially if I went to Baidu.

        Of course, I could use Google, but it was a little less awesome. Slow load times and dropped packets. But that was probably for my own good. The Internet is better off without information about the Tiananmen Square Massacre. And China did a great job at making the open, distributed Internet just a little less awesome.

        Thank you, Mozilla, for making the open, distributed Internet just a little less awesome in the US as well!

      4. Nicholas Steel wrote on

        Why are HTTP websites becoming less awesome? Your wording implies that there will be an active attempt to worsen HTTP instead of leaving it as is.

    2. Nando wrote on

      Why will IPv6 never actually happen?

      1. J.R. wrote on

        Because there’s no way to gradually transition. Everyone has to buy in simultaneously for a “switch over”, which will absolutely never happen.

        1. Oedipus wrote on

          Not a single part of that is true.

          IPv6 usage is rising on what sure looks to be a standard sigmoid growth curve. It is normally deployed gradually and compatibly and does not require any sort of simultaneous switchover.

        2. Alex wrote on

          Sure you can have a gradual switch-over; that’s the whole point of dual-stack networks and happy eyeballs in client applications and operating systems.

          I’ve been running with a native ISP provided IPv6 connection for a couple of years, it’s completely transparent.

    3. Simplebeian wrote on

      Or you know… StartSSL.

    4. Dan wrote on

      I agree. Forcing everyone to buy a $995 SSL certificate will do nothing but ruin the internet for small companies – tilting the playing field more in the favor of large corporations once again.
      Mozilla has sold its soul…

      1. Neal wrote on

        Google for free ssl certificates

        About 9,480,000 results (0.49 seconds)

        So where are you spending that $995 cause I’ve got a real nice bridge I’m looking to sell

      2. Simplebeian wrote on

        Who is paying $995 for a certificate?

    5. mathew wrote on

      Actually, that’s a great idea — roll out new Mozilla features that only work on IPv6. That’ll force adoption, right?

      1. Grad wrote on

        Not really, it’ll force people to use another browser. If Firefox had a market share of 80% it’d work, but with the current market share? Not quite.

    6. Grad wrote on

      If you worry about those 9 bucks a year for the cert, check out Let’s Encrypt which will give you free certs…

      1. Anon wrote on

        This ^

  4. Peterr wrote on

    Wow! This is fantastic!

    I’m looking forward to having all web sites require signed certificates! I will be much more secure! The great thing is, if there is a scammy web site, or even one which is not politically acceptable, the government can just have the CA revoke the certificate! And I’m safe from content I shouldn’t be reading.

    (footnote: My previous comment was negative. Fortunately, you moderated it away! Thank you for protecting me from having posted something perhaps foolish before!)

    1. cxqn wrote on

      Yes! Hooray! Please, Mozilla, be aware of the flaw in the CA system.

    2. NoneWhatsoever wrote on

      What utter nonsense. If a government doesn’t want you to read something on the Internet, then assuming they have any kind of jurisdiction over the site or its CA, they can just, you know, shut down or seize the website.

      You don’t honestly expect anyone with a brain to believe that requiring certs introduces some kind of new avenue for government censorship that didn’t exist before, do you? Pure idiocy.

  5. Kise wrote on

    Could you give example of those features that will be disallowed on http?

    Also keep in mind that there are millions of websites that don’t deal with your information, such as my own blog, where I write a post maybe once a month. Requiring SSL for such a simple blog is overkill and overhead for no reason other than “we say so”.

    Also, is Mozilla willing to give free SSL certs + IPs to all those websites on the internet? Not all browsers support SNI to allow multiple certs on the same IP, most server providers don’t give more than 1-2 IPs per server, not to mention hosting companies hosting thousands of websites on a single IP.

    1. rbarnes wrote on

      Some things that have been discussed include geolocation and getUserMedia (microphone and camera access).

      Mozilla will not be giving away free certificates, but we are very supportive of Let’s Encrypt and other projects to make certificates more widely available. We can’t do anything to make IP addresses easier to get, though we fully support IPv6 and SNI.

  6. wowaname wrote on

    There are places where HTTPS is overkill, especially with the obvious cases such as localhost / LAN websites, and in cases where end-to-end encryption is already present such as Tor hidden services and I2P eepsites. Also, there is the broad category of all those old static websites that are left up for reference and haven’t done anything to update their content or servers for years, so I don’t see HTTP ever completely phasing out. HTTPS is good for the dynamic sites that we trust to keep our information secure, but saying it is the only option is unrealistic.

  7. Peter wrote on

    Deprecate all the things, before there is a sane* alternative.. Great plan!

    (*not sane: paying $400 every 2 years to an ominous security company for a lousy cert)

    1. Simplebeian wrote on

      $400… the hell you buying your certs from?

  8. Whitney wrote on

    I sincerely hope this is strictly enforced. My router’s web interface uses weak encryption that prevents it from being viewed over HTTPS. Having disabled HTTP, I could not find a single browser that would allow me to load the page. I couldn’t even force the browser to let me load the page. I had to telnet onto the router to re-enable HTTP to get to the configuration page. If we make browsers HTTPS-only, all those legacy device-configuration pages will become inaccessible.

    1. Sebastian Jensen wrote on

      Most likely, anything intranet will get excluded from these limitations. Anything else seems illogical.

      1. Richard B wrote on

        I certainly hope intranet addresses are excluded (localhost, 192.168/16, 10/8, etc.) as there are also lots of long-running applications that have mini HTTP webservers to serve up current monitoring data. They specifically used HTTP because it was easy to implement in code and consume in different ways; HTTPS will force shims like stunnel to be placed in between. These apps typically only accept connections from the same subnet and are never seen on the internet at large.

  9. James wrote on

    Not all data needs to be secure. Not all websites need to be secure. Requiring HTTPS means additional compute and additional servers securing something that may not need to be secured, and provides no benefit – only cost. Free and open information should be (optionally) free of encryption as well.

    And if other browsers don’t follow suit you’ll be painting yourself into a corner by being intentionally incompatible with non-https sites.

    BTW, in case you care, I’m a donor to Mozilla because of Firefox, but this type of move could drive me back to one of the big 3.

    1. Jipp wrote on

      Sorry, but encryption is *not* computationally expensive.

      “In January this year (2010), Gmail switched to using HTTPS for everything by default. Previously it had been introduced as an option, but now all of our users use HTTPS to secure their email between their browsers and Google, all the time. In order to do this we had to deploy no additional machines and no special hardware. On our production frontend machines, SSL/TLS accounts for less than 1% of the CPU load, less than 10KB of memory per connection and less than 2% of network overhead. Many people believe that SSL takes a lot of CPU time and we hope the above numbers (public for the first time) will help to dispel that.”

      In the same series:

      1. Bill A. wrote on

        That’s because SSL sessions are cached, so the cost of initial key exchange is amortized across dozens, hundreds, or even thousands of connections. And the largest CDNs like Google have custom SSL stacks that permit sharing sessions across physical servers.

        Whereas for small sites where each viewer only visits a few pages every once in a while, the connection cost will be very significant, even if you run multiple servers.

        Built-in hardware AES in modern Intel and AMD CPUs makes the cost of bulk encryption negligible. But key exchange was and still is costly compared to unencrypted connections. Elliptic curve keys reduce the cost considerably, but only in comparison to RSA key negotiation. You’re still talking fractional connection throughput relative to unencrypted connections.

        HTTP/2 multiplexing might help, but given how poorly HTTP 1.1 pipelining has been supported by webapp stacks, I think it would be foolhardy to rely on it to save the day.

        That said, there’s plenty of fat to trim in various webapp software stacks. Even though in absolute terms SSL is _still_ expensive, I don’t think it’ll prove to be a big deal. I’m more concerned about the Certificate Authority racket.

        1. Andy Green wrote on

          “Whereas for small sites where each viewer only visits a few pages every once in awhile, the connection cost will be very significant”

          Yeah. But as ‘small sites… each viewer only visits a few pages every once in a while’, how significant can that be? They are small sites, and each user doesn’t do much… it’s not a problem then…

          1. gggeek wrote on

            But many small sites are hosted on a single server. Since the server cannot reuse the SSL connections of userA->siteX for userB->siteY, it will take considerably more load.

  10. Andy wrote on

    For those commenting about the cost of certificates – note that the price for most basic certs these days runs on the order of $10 to $20 per year, so not as exorbitant as it used to be (and no more expensive than having to pay yearly domain name registration fees). Furthermore, the EFF, Mozilla, and others are well at work building a completely free CA as part of the Let’s Encrypt project. There are also CAs operating today through which one can obtain certs for free. So while there are legitimate areas for concern in doing away with HTTP, I don’t really think cert cost is one of them.

    1. Kise wrote on

      While there are cheap certs (< $20), it’s not so cheap when you consider that a lot of hosting companies host thousands of websites, SNI is not widely available yet (at least on old Android phones), and with IPv4 there are even fewer IPs than websites. Unless this is solved, I foresee the same thing happening as when Mozilla backtracked on not supporting H.264 after losing huge market share to the likes of Chrome.

      1. Andy wrote on

        I think SNI is more widely available than you suggest. I run a number of SNI-based websites and have for years now with no user complaints. Unless your visitors are using Windows XP (a significant security problem in and of itself) or Android 2 or earlier (now over 4 years old), I don’t think any significant portion of web traffic is still SNI-incompatible. And given that a large swath of the secure web is already unavailable to such individuals, claiming that we should avoid rolling out additional security features to support the few percent (or less) of users who can’t use them seems a bit of a stretch. And people without SNI support are only going to become rarer over the next few years as this initiative progresses.

        1. gggeek wrote on

          SNI has still to happen for the internet-of-things. Lots of non-browser HTTP clients have much simpler/outdated networking stacks (case in point: I just spent 3 days battling Jira, the flagship app from Atlassian, which does not support SNI for its HTTP calls, even when running on Java 8…)

      2. alex wrote on

        Isn’t the stock Android (version < 3) browser already the only remaining relevant user agent that doesn't support SNI? So by the time this change is completed (i.e. at least several years in the future), there will be only a really small number of users of such old browsers left.

  11. Iain R. Learmonth wrote on

    What’s going to happen when the content is local? Are you going to have to run a webserver with HTTPS in order to do web development now?

    There are times when you genuinely require not using encryption. You’ve missed a large part of the point of HTTPS, which in this case is the part that seems to apply: HTTPS provides an authentication mechanism, and yes, I can see how this is useful to protect people from malicious code. Now a cracker will have to go out and spend £5.99 on an SSL certificate for his malware to work. But the use of enforced encryption has negative consequences in some cases, and browsers should be flexible in this regard.

    In the case of amateur radio, the use of encryption is, for the most part, forbidden by the license conditions in every country I know of (there are exceptions, for example when supporting a service with personal information involved). Is Mozilla saying that because you’ve decided to jump on a bandwagon, we’re going to have to go and find another browser?

    In the case of network hardware (and I’m guessing other hardware) the web interfaces can be quite dreadful and often will have poor SSL implementations. I’ve already had problems with being able to access switches to reconfigure them. Currently I just firewall these off and make sure the interfaces are only available from select machines. Am I now going to have to set aside another machine to run an older version of Firefox to manage these switches too?

    In the case of Internet engineering, especially in the development of these new protocols, it can be easier to see how things are working, performing packet captures, etc. when encryption is not in use. Mandatory SSL would mean extra steps in debugging experiments and this extra work could be avoided. (Of course, I’m aware that testing with the encryption is also necessary, but one of the advantages of an open source project is that you can take a white box approach).

    I agree that for the most part encryption is a good thing, and that most service providers should have mandatory SSL to protect connections to their services, but Firefox is running on MY computer. Why should Firefox be artificially limiting what I can and cannot do, not based on any technical limitation? That seems a ridiculous step for an open source project to be taking. There are times when communications are deliberately not secure, there is no way to make them secure, or they have been secured through other means.

  12. Zach wrote on

    I like this, prohibiting login forms from being submitted in a non-secure manner would be a great first step. And prohibiting HTTP POST requests on non-HTTPS altogether might be a good step too. Consider adding an in-browser banner above every webpage, letting the user know this page’s contents may have been altered in-transit, and that nothing on it can be authenticated.

    1. Kise wrote on

      Grats, you just broke more than half of the web.

      1. RandomHacker wrote on

        No; half the web was already broken, it was just failing silently while it passed our credentials to passive adversaries. Mozilla is making it fail loudly, and good on them. We’re not going to gain an inch of security if we’re crushed under the weight of supporting every bad idea anyone has had for the past twenty years.

  13. Phil Rosentahl wrote on

    This is a very bad idea.

    Encryption carries large computational costs, and reduces performance by breaking sendfile on servers.

    There are many places where Encryption is valuable (eg: websites that handle private information), and there are also many places where Encryption is completely wasteful (eg: video streaming websites).

    Datacenters are already huge consumers of power, and will necessarily increase this power consumption for all of this unnecessary encryption.

    I sincerely hope that no other browser follows suit, and Mozilla realizes how bad of an idea this is.

    1. Jipp wrote on

      Sorry, but encryption is *not* computationally expensive.

      “In January this year (2010), Gmail switched to using HTTPS for everything by default. Previously it had been introduced as an option, but now all of our users use HTTPS to secure their email between their browsers and Google, all the time. In order to do this we had to deploy no additional machines and no special hardware. On our production frontend machines, SSL/TLS accounts for less than 1% of the CPU load, less than 10KB of memory per connection and less than 2% of network overhead. Many people believe that SSL takes a lot of CPU time and we hope the above numbers (public for the first time) will help to dispel that.”

      In the same series:

      1. J.R. wrote on

        But it *does* break sendfile.

      2. J.R. wrote on

        Also, your gmail counter-example is silly. An email webapp is a few KiB of HTML and a few dozen small text files. It may add up to a lot with millions of users, but it’s still nothing at all compared to streaming video. Video streaming sites consume bandwidth that runs into high double digit figures for percentage of global bandwidth.

        1. alex wrote on

          YouTube delivers its video streams over HTTPS (at least the stuff in Firefox Network Monitor looks like they do). So it doesn’t seem to be impossible to do video streaming via HTTPS.

          1. Phil Rosenthal wrote on

            Google is also a multi-billion dollar company with enough cash available to deploy the 2x servers required for encrypting video streams.

            See this document:

            Even after all of this development, Netflix was unable to achieve better than 50% of HTTP performance when using HTTPS.

    2. passcod wrote on

      It has been known for years that HTTPS is not overly significant in terms of computational costs on servers. Estimates range at about 1% of server load on average, and anyway the largest performance penalty for HTTPS by far is the handshake process (because it adds requests on initial connection, and that uses the network… the network being orders of magnitude slower than your CPU, that’s where performance hits occur) which can be mitigated by various things, including using keep-alive… and that’s the default in HTTP/1.1.

      Encryption defeats not only purposeful attackers (so is useful for “private data”), but also many forms of censorship (and so is useful for just about anything else, including video sharing websites).

      I’m not quite sure what you mean by “Encryption breaks sendfile”.

      “I sincerely hope that no other browser follows suit” Actually, it was Chrome that started something like this, or at least it was Chrome that started deploying browsers with penalties (only visual at this point afaik, but probably getting more stringent as time goes on) for non-secure websites. So really, it’s Firefox that’s following suit.

      1. J.R. wrote on

        Who cares about “on average” if you break entire classes of use cases (e.g. streaming video). Talking about averages is cute, but it doesn’t tell the full story.

      2. Mildred wrote on

        > I’m not quite sure what you mean by “Encryption breaks sendfile”.

        This can also be described as sending the file over the TCP socket with no extra copies.

        Generally, you use the read(2) system call to read from the file into a local buffer, then use write(2) to write the buffer to the TCP socket. The extra copy through the buffer is unnecessary and can be removed, either by using sendfile(2), mmap(2), or splice(2). The logic is the same and involves no extra copy to a userspace buffer: the file is read directly from disk and sent to the network.

        With encryption, you can’t send the file straight to the network and it needs to be processed by the encryption layer. When the file is static and public, this is purely a waste of resources.

        Preventing modification of the resource could be done more efficiently by generating a digest hash of the content and signing it with a private key. The signature can then be reused by the server.

        Note that if there is a reverse proxy in the pipeline, the extra copy already happens on the reverse proxy server, so no adverse effect from the extra copy should be noticed when switching from plain HTTP to HTTPS. For small servers without a proxy, that isn’t the case.

        To sum up: this is great for server farms and big companies. This is great for authenticated traffic, but this isn’t great for unauthenticated and untracked traffic or small servers. This and the fact that I’ll never get a certificate from a CA because there is no CA out there that I trust.

    3. Nick Lewycky wrote on

      “There are many places where Encryption is valuable (eg: websites that handle private information), and there are also many places where Encryption is completely wasteful (eg: video streaming websites).”

      See this article “You Can Get Hacked Just By Watching This Cat video on YouTube”:

      The problem exists regardless of what the website is; all content served over non-SSL can be replaced by a man in the middle and is therefore an attack vector. Every HTTP GET request is exposed.

  14. James T James wrote on

    The Thirtieth day of April in the year of Our Lord Two-Thousand-And-Fifteen will go down in history as the day that the World Wide Web died.

    1. Gary L. L. wrote on

      No, just the day Firefox died. The rest of the way, I mean. It has been dying since Australis was forced on us, even though very few liked it.

  15. Adam Jacob Muller wrote on

    While this may be a well-intentioned move, there are far more pressing issues in security. What’s the point of forcing SSL on browsers if you’re not going to be careful about *what* CAs you give carte blanche to sign certificates? (see:

    Mozilla (and other browser vendors) also need to seriously consider the computational cost that is going to be associated with forcing SSL encryption and, in connection, the power usage and eventual environmental impact this will have. I’m entirely serious when I say that this will have a completely measurable impact on the amount of CPU time required to serve SSL, which translates directly to power usage and environmental impact.

    As other commentators have pointed out, this is going to break a plethora of things that are considered core to the internet. Want to run a personal website for your own consumption? Have to pay for a certificate. Want to run an anonymous site, no way, you have to verify your identity to get a certificate. Want to debug issues using common tools like tcpdump and packet captures? No way. Running a proxy for a secure institution or corporation that simply requires packet inspection (to ensure against data leakage) and thus must block HTTPS? Great, now your employees can’t even check the weather or traffic — excellent for morale.

    Even with HTTPS everywhere, people will still be able to see what sites you are browsing (either by sniffing your DNS requests or by sniffing SNI on your “secure” HTTPS requests), even if they can’t see the actual content. If content providers deemed that whatever you’re doing is security-sensitive or privacy-sensitive, they already have the ability to decide to secure that information over HTTPS, or not.

    There are also far better security measures that Firefox can take to ensure that site operators have the control and freedom to make their sites secure for everyone, for example HTTPS pinning.

    This just seems to me to be another case of others trying to impose short-sighted goals on everyone (in a father knows best attitude) for very limited gains and will be highly detrimental to the internet as a whole.

    1. alex wrote on

      Most of the things you claim this breaks aren’t really true: If you “run a personal website for your own consumption” or “want to debug issues”, it’s trivial to generate self-signed certificate and import it (if needed temporarily) into your browser’s trust store.

      1. David Cantrell wrote on

        No, it’s not trivial. It requires arsing about on the command line, and then arsing about in obscure corners of your browser. And don’t forget the obscure corners of your phone’s browser as well, if you want to use it from there.

        This makes it infeasible for normal people.

  16. Phil Rosenthal wrote on

    Note how Netflix admits that even after all improvements, there is still a 50% reduction in performance after introducing SSL. All so that they can re-encrypt the same videos over and over.

    Anyone who claims that there is no computational impact for high-bandwidth static file serving is just flat-out wrong.

  17. Ben Cooke wrote on

    What’s the plan to ensure that governments can’t compel CAs via secret courts to issue fraudulent certificates so they can execute MITM attacks?

    What’s the plan to fix the CA system so that one malicious/incompetent CA can’t compromise the whole system for everyone else?

    This change seems premature. There is no value in pushing people towards a system with such obvious flaws in it.

    1. J.R. wrote on

      This is like saying: “putting a lock on my door is all well and good, but what’s to stop someone prying it open with a crowbar?”.

      1. Ben Cooke wrote on

        I disagree. SSL is already deployed enough to protect the overt threats on my security: SSL is used to collect my credit card number and other such instruments.

        Privacy rather than security is the motivation for blanket SSL, but the government is the main collector and abuser of the cleartext metadata in question and the biggest threat to those for which privacy is a significant issue.

        Locks don’t afford privacy.

  18. brian wrote on

    > and the US Government calling for universal use of encryption

    Yeah, I wonder why… uhmmm, maybe because X.509 is COMPLETELY BROKEN?! Are you serious, Mozilla?? If you’re seriously going to do this, then at least remove those self-signed cert warnings.

  19. 78 wrote on

    bad idea. some places in the world need to have locks, sure, but others specifically need to not have locks. diversity is essential for survival.

    1. hugo wrote on

      The lock-metaphor doesn’t really work here. The reason that certain places “specifically need to not have locks” is that they need to be open to a large or unspecified audience and that access to them needs to be as fast as possible. HTTPS doesn’t impede either.

      1. TimC wrote on

        You’ve obviously never used HTTPS before.

  20. Nate wrote on

    I hope this doesn’t just serve to reduce the usage of Firefox further.
    If other browsers don’t implement these same changes, then it will just be a case of it appearing that Firefox causes problems for users that other browsers do not.
    With Firefox usage dropping regularly, I’m not sure it is really in the position of forcing any sort of changes on anyone.

  21. FF Extension Guru wrote on

    I run around half a dozen sites. None of them are eCommerce. They are simply blog and gallery sites. The only login is for admin purposes. I don’t self-host these; they are on a shared server. I don’t get revenue from these sites, so the way they are hosted now works well for me, as this is a hobby, NOT a business. I suppose if this were to go forward, the only functionality that would “break” would be the admin login, in which case I would have to just use another browser for that portion. Of course, if this would prevent users from submitting comments, then I would have a big problem. Trying to convert to secure seems like overkill and would be costly and time consuming.

    I do run one other site, which is an eCommerce site; it is on a different server with an SSL cert (which they really don’t technically need, since all the payments are processed by PayPal, but we wanted their customers to feel secure, and the SSL does encrypt the transmission of the order contents, which include their non-financial personal information).

  22. SteveP wrote on

    Would be even better if it didn’t mean/imply that this now forces everyone running a site to pay for a cert. How about a free and quality cert provider to make it painless?

  23. Oscar wrote on

    Cool idea, but it will be complicated for infra providers who offer video streaming services. Not all streams need to be encrypted, and broadcast streams will be transmitted with another layer of encryption within HTTP, thus using HTTP as the transport protocol. This move will have an impact on these kinds of content providers. Have you considered that?

  24. Keith Curtis wrote on

    Encryption is valuable, but it isn’t necessary or interesting for a large number of sites. Take mine, for example. I have some free blog postings, some music, a link to my book, etc.

    It is all information I give away for free. Why would I encrypt it?

  25. none wrote on

    Fab news with a sizable godzilla in the corner..

    In a world where https is as affordable for all as http, we would probably have https rules ok by now anyway.

    However, in the exploitative environment the web has developed into, server providers consider https either an added feature to be added for a fee, or already slap a higher cost for their services which include “free” or “default” https.

    By forcing rather than negotiating to move into https – mozilla & firefox are not helping some of the more financially challenged developers.

    ..and that is the opposite of Fab – how is it called?


  26. Roy wrote on

    Congratulations on sending a message to the web developer community that they need to feel warm happy feelings. The Internet has SSL on *both* of its web sites (Gmail and Facebook). There aren’t any other use cases for a web browser, are there? I’m glad that SSL is there to protect me against the army of evil hackers sniffing my ISP’s core routers.

  27. Evan wrote on

    I’m glad of the decision, but what’s the story on browser support for DANE (that is, TLS certificates in the DNS, signed and validated using DNSSEC)?

    DNSSEC is easy nowadays (as long as you picked the right registrar; I voted with my feet on that point). I really have no desire to pay a CA, though. At the moment, I run my personal site over both http and https, using a self-signed cert; people can plow through the scary warning page and add an exception if they really want their traffic encrypted. But if I could pop a TLSA record into my zone and give people a green lock for free, I would be so very happy about that.

    1. tjeb wrote on

      Yes please. TLSA still gives one the option to use a CA, but also allows the use of self-signed certificates. In addition, it can close the leap-of-faith that is still there with HSTS, *and* it signals to the browser that https can be used before an http request is made (which could then still be hijacked).
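      For readers unfamiliar with DANE: a TLSA record binds a hash of the server’s certificate to a DNS name. A sketch of how the record data is derived, using the common “3 0 1” DANE-EE form (the certificate bytes below are a placeholder, not a real certificate):

```python
import hashlib

def tlsa_rdata(cert_der: bytes, usage=3, selector=0, mtype=1) -> str:
    """Build TLSA RDATA: usage 3 (DANE-EE), selector 0 (full certificate),
    matching type 1 (SHA-256 of the DER-encoded certificate)."""
    assoc = hashlib.sha256(cert_der).hexdigest()
    return f"{usage} {selector} {mtype} {assoc}"

# Published for HTTPS on example.org, the zone line would look like:
#   _443._tcp.example.org. IN TLSA 3 0 1 <sha256-of-cert>
```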

  28. Matthew Kercher wrote on

    When did the open source community become so focused on restricting users instead of enabling freedom? I understand the need for security, but this is vastly overkill. I run my school radio’s website; should I encrypt that? For what purpose? So some scary man in the middle can’t see that someone really has school spirit? This is stupid and needless and will only serve to A) make the internet less readily accessible to everyone and B) reduce Firefox’s relevancy even further. When I’m testing PHP with Apache on localhost, should I have an SSL cert for that? When I stop being lazy and finally slap up a resume website, should I pay for a cert for that? When developing software it’s important to value all use cases, and in this facet Mozilla put forth very little thought.

    1. Andrés Rodríguez wrote on

      > Make the internet less readily accessible to everyone

      I’m sure Mozilla is going forward with this because of Let’s Encrypt.

      > When I’m testing PHP with Apache on localhost, should I have an SSL Cert for that?

      Localhost could be an exception by default, and on FF Developer Edition forced SSL might be disabled by default. It could be a single flag in about:config for all we know.

      1. Mike Simon wrote on

        I looked at Let’s Encrypt. It seems really simple if you run Linux, but what about Windows servers? Does it support wildcard domains? I guess its being free kind of obviates the need for multi-domain certs, so that’s cool, but if it doesn’t do wildcards or work on Windows servers, some people will be left out.

  29. bosse wrote on

    Hopefully this will be a fast transition, and not something that takes 2+ years. My second hope would be for Google/Bing/DuckDuckGo etc. to reduce the ranking significantly for HTTP sites and, after a transition period, stop showing unencrypted websites with default settings, and also to remove weak SSL ciphers/options.

  30. Gustaaf wrote on

    I think it’s way too early to make a move like this. A secure web is only possible when IPv6 is globally adopted. Worldwide only a few dozen ISPs support IPv6. There are simply not enough unique IP addresses for all sites to run HTTPS.

    You will (in the near future) stop introducing new features for HTTP and even disallow ‘insecure’ features. So no more javascript for HTTP-sites for example? I can hardly think of anything that will benefit from HTTPS apart from peer-verification.

    A whole lot of people have hobby sites, including myself, that access a lot of state-of-the-art but possibly insecure features. Do we need an SSL certificate too? Bullish. It’s not only the cost of the certificate itself, but also the separate hosting that will add to the CoO.

    Before moving in this direction I strongly recommend waiting for:
    1. A better alternative to HTTPS, which is too slow and also insecure (according to Snowden, the NSA can already crack most HTTPS connections).
    2. Worldwide IPv6 adoption, not expected in the next 10 years. Although it should’ve happened 10 years ago, still only a single-digit percentage of all internet users can access IPv6 sites natively. There are simply not enough IPv4 addresses right now for every single website to have its own IPv4 address.


  31. mk wrote on

    This is bad news.

  32. Guest wrote on

    Firefox still can’t handle self-signed certificates or certificates from properly.

  33. Guest wrote on

    I hope you at least allow people to switch the http mode back on in about:config. You can have a new setting like “protocol.http = false” by default but allow people to switch it back on if they want manually.

  34. Guest wrote on

    Everyone prepare to see this message a lot for HTTP sites too LOL >>

  35. G wrote on

    Well there goes caching…

  36. Guests wrote on

    Please read this “Please consider the impacts of banning HTTP” @

  37. Lars Viklund wrote on

    I hope that this initiative involves someone reworking the extremely broken and WONTFIX client certificate selection dialogue box.

    It’s bad enough for the few work infrastructure sites that require it, if I have to nag-OK every single site I go to, Firefox is dead.

  38. Nick wrote on

    I’m completely against the idea of forcing people to use HTTPS for the latest features. Not all websites need to be secure. Sites that involve collecting personal information, or that require logins and passwords, I accept should be secure. But if a site is purely for advertising and nothing more, what exactly is the purpose of this? Maybe Mozilla should find out how much support they have for this suggestion before trying to force everyone down this route.

    1. Zed wrote on

      NO, man they’re proud to announce it’s their newest decision and they’re sticking to it whether no one else wants it or not because the government wants it!

  39. Shelikhoo wrote on

    A restriction might be:
    unless the website is serving from the local network,
    1. you can’t use a password input,
    2. you can’t set cookies,
    3. JavaScript is disabled.

  40. Mike Sirell wrote on

    Please, PLEASE stop using the word “secure” when you mean “HTTPS”.

    Firstly “secure” is not a binary state.

    Secondly the idea that HTTPS web traffic cannot be intercepted, decrypted, modified, blocked,.. etc was a highly dubious claim ten years ago, and in 2015 it’s ludicrously, laughably wrong. To try to claim otherwise is extremely dangerous.

  41. rott wrote on

    are we going to be like in “Idiocracy” movie ? all the people in charge are dumb??

    – MS wanted to change the way people look at computers with win8 and they found out it was their way only and people decided to remain on w7
    – Mozilla wanted to join the “identical interface” (chrome, opera ) of a browser with Australis and they lost people ( i switched to pale moon and while i like chrome`s interface i loved the previous firefox also, but not the new one)

    now mozilla moves to cripple sites with their browser, enjoy your drop in users

    ps. MS had the advantage that there is no real OS that can compete with Windows, and the choices were to stay with old w7 or move to w8, so there was no real drop in users on “windows”; but in the browser arena there are some very good choices.

  42. roman wrote on

    Disabling features that used to work is a really bad idea.
    I recommend Firefox for various HTTP intranet sites. Having features there stop working because of a browser update would be extremely annoying.

    This will also have a chilling effect on small websites. Having to manually get a new cert every year and read through pages of CA terms of service, etc. (I do read them) is not something I’d put up with for every website.

    Even if we get services like letsencrypt or the like, this move is going to make the web a lot less open and decentralised.

    I’d at least expect Firefox to get a lot less hostile toward self-signed certs.

  43. Adrian wrote on

    While I really love the idea of widespread encryption, I believe there are cases where it really doesn’t make much sense.

    For example, I own a website where I do nothing besides showing my portfolio of photos. The only interactive element is a contact form. Encryption doesn’t really add any advantages there.

    1. John Teague wrote on

      The advantage is in the utility of an overall safer web ecosystem, not about one site. Think vaccinations.

      1. Valerio Bozzolan wrote on

        IRL if you don’t want “vaccinations” you are not eliminated by the doctor ._.

  44. Unary Negation Operator wrote on

    When I read that you want to enforce HTTPS, I thought: “Oh the governments would be happy about it”, and next thing I see, you admit to being seduced into this decision by the governments.

    I’m sorry Mozilla, it seems like you forgot your purpose and origins.

    Did you mix the dates maybe? This was supposed to be published on April 1st?

  45. Unary Negation Operator wrote on

    And if anyone has doubts: the next step will be to enforce USERS to have certificates, that uniquely identify them. Because security, wink wink!

  46. QJ wrote on

    I definitely advocate this intent.

    Some comments say “I don’t need HTTPS because my site is non-commercial and even without log-in page”.
    But I should point out that HTTPS brings other benefits besides security.

    For example, if I were a MITM, I would be happy to inject JS to pop up an ad on your HTTP website, or substitute my Baidu ad for your Google ad.
    Also, I can easily redirect visitors from your site to other sites with my affiliate link, making visitors believe it’s your site’s fault.
    These are very common across ISPs in China. HTTPS mitigates such behavior.

    1. Zed wrote on

      What if it’s literally a sandbox for experimenting as a developer? (Read: Need access to features) Plus my own blog where I write text and have no ads or images?

    2. brian wrote on

      Yeah, and that’s why I use NoScript. Also, MITM is not the only way to inject javascript into websites.

  47. Toady wrote on

    Interesting idea, but how does this affect things like Google AdSense? Its code is served over HTTP, which gets blocked under HTTPS as mixed content. Does this mean that Google will now add support for their ads to show on secure HTTPS pages?

    If so, how does that fare against the whole security concept of deprecating HTTP? Is Google phasing out their AdSense?

    How will ads on secure pages fare? Will they become the perfect proxy to steal data entered on secure pages?

    1. Zed wrote on

      Good question!

    2. Daniel Veditz wrote on

      I found a reference to the following help page in a 2011 blog comment; AdSense has apparently supported HTTPS for quite a long time:

  48. Justin C. wrote on

    Reading the comments below, and my answer to nearly everyone is “what a bunch of FUD.”

    I pay MORE for my domain name (~$9 US/year) than I pay for a cert. I can easily get a COMODO single-domain cert for ~$6 US per year, depending on how long the cert lasts. With “Let’s Encrypt” becoming a thing this year, even that argument doesn’t hold water.

    The tin foil hat-wearing people talking about the government revoking your certificate because you’re being critical or whatever (especially the comparison to China) are absurd. The government isn’t involved in your certificates, nor are they involved with the certificate authority (unless you’re using one of the US Department of Defense CAs, in which case… Yeah, you WORK for the government.)

    Get real, people. You’re acting like every website out there that needs to be secure is currently secure, which is obviously not the case. The only way to properly secure the sites that need security is to enforce security through the whole protocol, which is what this is aimed to do.

  49. Nick wrote on

    I understand it has its merits, but it’s not a one-size-fits-all situation. It’s up to the owners of the website(s) to make that decision, not for it to be forced upon people. Does this now mean that if I want to build a website within my own development environment, I need to have HTTPS certificates set up to test all its features before it’s launched?

  50. Fabio Muzzi wrote on

    You are 30 days late. April Fools’ Day was last month. Please be serious.

  51. ziogianni wrote on

    Have you considered that someone could just recompile from source, so that http can be “reborn”? 🙂

  52. Gilberto Persico wrote on

    Are you nuts? April Fools’ was a month ago. Please focus on fixing things, not on government or multinational requests, if you are still an open-source foundation.

    1. Giuseppe wrote on

      I honestly don’t see the point here. I think malicious javascript can run either with or without a certificate. You can even get a certificate for free, so I reaaaally don’t see the point here.


      1. Anonymous wrote on

        Without HTTPS it is much easier for a man-in-the-middle to inject malicious JavaScript in pages of innocent websites they don’t control.

        1. Erkki Seppälä wrote on

          And what is that malicious JavaScript supposed to do? From the user’s point of view, anything they receive from a maliciously MitMed HTTP site could just as easily be received from a perfectly HTTPS-secured site that simply serves the same content itself.

          It is simply a matter of considering what kind of data one chooses to send to a site, encrypted or not. If you expect confidentiality, you may choose to use encryption.

          Related to this is the Chinese firewall attack on GitHub. While it was unfortunate, I don’t think in the end it was something HTTPS would have prevented. After all, China can simply require that everyone either uses its HTTPS certs or doesn’t use any HTTPS certs.

          1. Christian Parpart wrote on

            Erkki, it is about protecting end-users’ privacy.
            Also, bad JavaScript can modify your complete webpage at will. Cheers.

          2. Matthew wrote on

            Maybe it would not have stopped China, which can use state power to coerce certification authorities. We also know that NSA has been able to insert itself into certification authorities as well. However, most attacks aren’t states, they’re mostly teenagers just screwing up sites for the fun of it. Universal HTTPS would be pretty effective against that.

          3. Josh wrote on

            SSL/TLS also provides a way to authenticate a site as being controlled by the people the CA validated. MiTM in a world where everything is SSL requires some trickery such as squatting on similar misspellings of domain names and using certs made of those, or using certs from self-signed CAs or other CA’s that aren’t usually in the web browser to begin with. All this together makes it a lot harder to MiTM a site that’s using an SSL cert from a reputable CA. It’s not just about encryption.

      2. Robert P wrote on

        It is so freakin easy to MITM connections though. It makes it much more difficult to MITM useful sites when they have HSTS, upgrade CSP attribute, public key pinning, and secure cipher suites.
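        Of the measures Robert P lists, HSTS, the upgrade CSP directive, and public key pinning are all delivered as plain HTTP response headers. A minimal sketch (the values are illustrative placeholders; the HPKP pins in particular are fake):

```python
# Illustrative values only; the max-age numbers and pin hashes are placeholders.
SECURITY_HEADERS = {
    # HSTS: browsers remember to use HTTPS for this host for a year.
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    # CSP directive asking the browser to rewrite http:// subresources to https://.
    "Content-Security-Policy": "upgrade-insecure-requests",
    # HPKP: pin the server's public keys (these hashes are fake).
    "Public-Key-Pins": ('pin-sha256="fakePrimaryPinBase64="; '
                        'pin-sha256="fakeBackupPinBase64="; max-age=5184000'),
}

def add_security_headers(headers):
    """Return a copy of a response-header dict with the security headers merged in."""
    merged = dict(headers)
    merged.update(SECURITY_HEADERS)
    return merged
```

        Cipher-suite choice happens at the TLS layer rather than in headers, so it is configured in the server’s TLS settings, not here.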

    2. Benjamin Smith wrote on

      Honestly, rather than actually deprecate http sites, why not pick a *sane* warning system?

      Any site that isn’t encrypted or refers to resources that aren’t encrypted SHOULD HAVE AN ANGRY RED ICON on the address bar.

      Any site that is encrypted but isn’t perfect should have a YELLOW icon because you have some protection. EG: 1024 bit key, stale or self-signed certificate, etc.

      You should see LIGHT GREEN for the “basic” SSL sites.

      You should see DARK GREEN for the “security enhanced” SSL sites.

      The current warning system is lame – it indicates that all is well when you are the *least secure*.

      1. Daniel Veditz wrote on

        In parallel to this effort, or indeed as part of it, we are working on redesigns for the site indicators. As Richard said in the title, “deprecating” insecure HTTP. Part of deprecation is giving people warnings when they’re using a feature you’re trying to phase out.

      2. Cody wrote on

        SSL certificates ARE the warning system in HTTPS…

    3. Ted Kraan wrote on

      I am still looking for the 1st of April joke/reference.

      HTTP was never invented to be secure. Analogies would be ‘Let’s ditch forks because you can’t eat soup with them’ or car analogy ‘Let’s ditch cars because they can’t fly’.

      And the movement is all wrong too. Where is this free, open and transparent web we used to have? In the future the internet will be controlled by a few mega-corporations, which will feed it grey, uncreative, uninspired junk.

      Why not attack the root of the evil. The Javascript/ActiveX/Actionscript engines that let a hacker on the other side of the world install malware, because you navigated to a wrong site? Why was that never addressed? More curiously, who invented that stuff? Push installations from a remote host through a browser? What were they smoking when they thought of that?

  53. Lestat wrote on

    What about hobby projects which only offer static HTML pages? This is outright discrimination against small web projects which either see no reason to move to HTTPS or have neither the time nor the necessary knowledge.

    What you propose is discrimination against simple webpages which pose no danger. You are losing touch with reason more and more.

    Switching to another browser now, good bye!

    1. Brian wrote on

      All of the service tiers that Microsoft Azure offers for websites, including the free tier, have had an HTTPS endpoint for years. It’s transparent to the individual website and requires zero effort or change on the part of the web developer; you get a Microsoft certificate. The only work you need to do is if you decide to have your own domain name.

      If Microsoft can do it, every hosting service can too.

      1. Kirrus wrote on

        Microsoft have a CA cert. They can automatically sign certificates. Mom & Pop hosting don’t, and can’t.

        This is not a good thing for small hosts, nor small webmasters.

        1. Cody wrote on

          Hi Kirrus,
          You’re wrong. Mom and Pop can generate SSL certificates for free with StartSSL after a $50 verification… If they can’t afford $50, they probably shouldn’t be running a business.

          1. Cobab wrote on

            When I was younger, there were people on the internet with webpages who weren’t running any business. That’s almost over, and I see Mozilla wants to finish the job and kick us all out.

        2. Ron E. wrote on

          Cloudflare also offers free SSL certs. Learning to properly set up SSL takes maybe a few hours if doing it manually.

      2. Zor wrote on

        This is not the answer I expect from Mozilla. Thanks for pushing the Big Corp “Cloud” providers’ agenda and screwing over small providers. Nice way to defend the “open” web.

        1. Daniel Veditz wrote on

          We are pushing for secure communication on many fronts. Richard is describing eventual plans for the browser. At the same time, Mozilla is supporting the nascent Let’s Encrypt effort to provide easy-to-manage free certificates to domain owners. All the parts have to work hand-in-hand before we go from the “intent” described in this post to shipping.

    2. kobaltz wrote on

      Use CloudFlare with their free tier. You get free SSL support without having to do anything; as far as browsers are concerned, the site is served over SSL. The connection between CloudFlare and your web server would not be encrypted, though.

      1. LvH wrote on

        And CloudFlare can then monitor all the traffic. 😉
        Any site using SSL behind CF, including those that chose Full-SSL and thus provided their private key, can have its data decrypted by CF at any time, acting as a MITM. So how exactly is that more secure than no SSL at all? OK, it gets harder to sniff packets on the local network. But for really sensitive data, it serves no point.

        Just as much that SSL actually serves no point at all for static HTML websites. Why on earth would you want to encrypt that? Complete waste of resources.

        1. Daniel Veditz wrote on

          You can’t have it both ways! Yes, CF’s free tier is inappropriate for “really sensitive data”, but so is a completely unencrypted connection, which was the subject of this article. Short of that type of data (for which you need a real certificate regardless of Mozilla’s plans to deprecate http), “harder to sniff packets on the local network” is a big win.

        2. Pavel wrote on

          There are lots of wifi hotspots that inject or replace ads in every HTML page requested by clients. And there are ISPs that sniff and sell user traffic data (URLs visited and cookies set by advertisers). And malware, of course. This is neither a theoretical threat nor a spy-movie one; automated HTTP interception is a lucrative business nowadays.

    3. James Patrick wrote on

      The Let’s Encrypt project should take care of both of these aspects.

  54. Patrick Lambert wrote on

    What about the fact that SSL requires a unique IP per site, or the ability to support SNI, which requires a specific set of OSes? For example, Windows Server 2008 R2 cannot use SNI, and that is (and is likely to remain for several years) the most popular enterprise server in the world.

    1. Ed Burnett wrote on

      The vast majority of the web runs on Linux, BSD and variants. If a company makes the business decision to buy into proprietary software, then they should factor in all of the licensing, obsolescence, lock-in, and long upgrade cycle issues that go along with it.

      1. hron84 wrote on

        Yep, but you’re talking about new servers. What about old, already existing servers? If someone buys a new server it will ship with a new OS, but upgrading an existing server is a whole different thing, and it can depend on the existing infrastructure too. I think Mozilla should not make infrastructure decisions for unknown companies. If there’s an old Win2008 server, there’s a reason for its existence.

        From another angle: as long as Microsoft supports Windows 2008 R2, Mozilla should support it too, regardless of personal preferences.

        1. Ed Burnett wrote on

          We’ve reached a point on the political world stage where unencrypted traffic is a danger to human rights. I believe Mozilla is implicitly acknowledging that this is a higher priority than legacy systems in use by some institutions that may present some difficulty and expense in replacing.

          1. LvH wrote on

            How is it a danger to human rights when it concerns static public HTML? Seriously?
            From the packet headers you can still see what site and page is being visited, and no SSL encryption in the world is going to change that… You’d have to tunnel the traffic first to achieve any gain there.

            Surely, SSL is better for sensitive data. Absolutely.
            But it serves no purpose to force it everywhere… Not a single purpose at all.

            1. User wrote on

              That’s not how SSL/TLS works.

            2. Ed Burnett wrote on

              Actually, HTTPS packets do not reveal the requested URL/page. Only the server address.

              The benefits to privacy and security for all sites are many, and the technical impact is negligible. The NSA has well documented its interest in people who merely read certain technical articles. Not to mention that by making a TLS connection you are assured that the data you are receiving is authentic, was delivered by the verified owner of the domain, and has not been manipulated in transit.

            3. Gabriel Corona wrote on

              > Actually, HTTPS packets do not reveal the
              > requested URL/page.
              > Only the server address.

              And the server (vhost) name thanks to SNI.

              And the protocol thanks to ALPN.
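              For readers skimming this sub-thread, the split can be sketched in a few lines of standard-library Python. This is a rough illustration only, reflecting the points made above: the hostname leaks in cleartext via the SNI extension (and the protocol via ALPN), while the path and query travel inside the encrypted tunnel. The function and example URL are invented for illustration.

```python
from urllib.parse import urlsplit

def https_wire_view(url: str) -> dict:
    """Sketch of what a passive on-path observer sees for an HTTPS request.

    Still visible on the wire: the server's address/port and the hostname
    (leaked in cleartext by the SNI extension of the TLS ClientHello).
    Protected inside the TLS tunnel: the request path, query string,
    headers, cookies, and response body.
    """
    p = urlsplit(url)
    return {
        "visible_to_observer": {"host": p.hostname, "port": p.port or 443},
        "inside_tls_tunnel": {"path": p.path, "query": p.query},
    }
```

              So “which site” leaks, but “which page” does not; as noted above, only tunneling the traffic (e.g. Tor or a VPN) hides the hostname as well.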

  55. lozl wrote on

    1) Then you should have a big ON/OFF button as a compatibility toggle
    2) Make a ONE-CLICK way to allow self-signed certificates, not the current clicking nonsense
    3) @startssl PR of israel-based company: sftu

  56. Sven Slootweg wrote on

    My response to this, given that it’s a little too big to fit into a comment:

    1. Dianne Skoll wrote on

      Sven Slootweg’s article is excellent. This is not a good idea and doesn’t buy any actual security as long as browsers trust a whole bunch of CAs. It just takes one crooked or compromised CA to crash the whole house of cards.

    2. Daniel Veditz wrote on

      You’re right that a browser deprecating insecure HTTP isn’t sufficient to create a secure internet, but Mozilla (and others) are pushing on many fronts. Mozilla is tightening up requirements on our CA program, for example, and while there’s still room for improvement we have eliminated cryptographically weak root certificates, required stronger signatures (killing MD5 hashes and on track to kill SHA1; increasing the minimum key size), tightened the audit requirements on CAs (which has resulted in the de-listing of at least one CA), and so on.

      You’re wrong that certificate pinning is “still not actually implemented by major browsers.” It’s implemented in Firefox and Chrome, which together represent about half of browser usage. Users have a choice to use a more secure browser, and even if they don’t switch, Firefox and Chrome users act as canaries detecting bad certs on popular sites and ultimately protecting users of all browsers (against general attacks; obviously a carefully targeted attack could bypass this).

  57. pyalot wrote on

    Are you fucking insane?

  58. Chris Star wrote on

    For a lot of static websites https literally does nothing except burden the webmaster, as a third party can still see which websites are visited.

    You shouldn’t punish people like a local bakery publishing a simple website to show off their bread, or a local archeologist showing his findings online, for the greater good. The internet should welcome tech-illiterate newcomers to share information online.

  59. rtechie wrote on

    As someone who does a lot of work with PKI, I think this is an extremely bad idea.

    Making HTTPS mandatory will seriously degrade the security of existing web sites.

    Right now, the main problems with SSL/TLS have to do with bad actions by root Certificate Authorities (like China’s CA) issuing inappropriate or questionable certificates.

    You’re assuming that site operators, and more importantly users, are going to use HTTPS intelligently and appropriately and that’s a bad assumption.

    Forcing every single site to use HTTPS means that unless that site has a certificate from a trusted root CA, users will get a browser error. And we’ve “trained” users to avoid sites with browser errors. This will create a “gold rush” with the root CAs as lots of smaller sites start requesting certs. This will inevitably lead to more bad certs being issued.

    And there will be a LOT more questionable certs issued.

    Because you intend to gate features behind HTTPS, you’re making it impossible to TEST using HTTP, so every single internal, QA, or test site needs a cert. Sure, they can use self-signed certs, but users will get a browser error. So now the organization either has to run its own CA or get more certs from the root CAs, which is a lot easier. That’s going to be a flood of cert requests on the CAs.

    I really need to stress what a problem it is that you’re requiring certs for all internal web sites.

    And what about intranet sites in general? Have you guys developed a better method, of any kind, for distributing enterprise root certs around? Right now, I have to manually install them on every PC. Now you’re saying I have to do that no matter what.

    The short version is that the core problem with HTTPS right now is that it’s too popular. Making HTTPS mandatory will further degrade its utility and put serious and important uses of HTTPS, like financial transactions, in danger.
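    A hedged sketch of a common compromise for the internal/QA scenario rtechie describes, using Python’s standard ssl module: have test clients trust the organization’s own CA bundle explicitly instead of installing the root cert on every PC; only for disposable local test VMs, relax verification outright rather than training users to click through errors. The function name and CA-bundle path are hypothetical, and this is not a recommendation for production clients.

```python
import ssl
from typing import Optional

def qa_client_context(internal_ca_pem: Optional[str] = None) -> ssl.SSLContext:
    """TLS context for internal/QA test clients ONLY -- never production.

    If the organization runs its own CA for intranet/QA hosts, trust that
    bundle explicitly here, instead of installing it machine-wide on every
    PC. Only as a last resort, for disposable local test VMs, skip
    verification outright rather than teaching users to ignore warnings.
    """
    ctx = ssl.create_default_context()
    if internal_ca_pem:
        # e.g. "qa-root.pem" -- a hypothetical internal-CA bundle file
        ctx.load_verify_locations(cafile=internal_ca_pem)
    else:
        ctx.check_hostname = False        # order matters: hostname check off first
        ctx.verify_mode = ssl.CERT_NONE   # then certificate verification off
    return ctx
```

    The point of the sketch is that the relaxation is explicit and scoped to test code, instead of leaking browser-error fatigue onto end users.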

  60. Lestat wrote on

    For everyone interested in using a more reasonable browser… Closed source, but these devs do care what the user wants and needs and implement it, i.e. adding customization and options instead of removing them: Gecko without Mozilla’s crazy ideas and Chrome-chasing.

  61. Jamie E wrote on

    This is a welcome development. In the modern era, we need HTTPS more than ever to provide data integrity assurance. With nation-states now willing and able to do packet injection and MITM data manipulation on a truly staggering scale, HTTPS is our only defense. And yes, this goes for sites of any size or purpose. Consider the myriad of bad actors with Internet backbone access, who can make it appear to visitors that your site content says something completely different than it actually does. Web users are becoming unwitting victims of injected rogue JavaScript programs that weaponize their browser to attack other sites. The Great Cannon of China is only the first salvo in a new expansion of MITM attacks by the powerful against the weak.

    If you’re a webmaster, and concerned about the costs and implementation details of HTTPS, please look at, and consider hosting with, a CDN that offers your site HTTPS. I don’t want to mention anyone in particular, but these things are very easy to find. It’s worth it.

    1. Lestat wrote on

      Give me one reason why a guy with a plain and simple static webpage should do all that extra work. Http is no crime.

      There is a difference between reason and discrimination. Mozilla and Google are following the latter path.

    2. Ben Cooke wrote on

      SSL (as currently deployed) does not defend against any nation state that contains a trusted CA.

      Governments compelling CAs to make fraudulent certs for MITM used to be theoretical, but it’s quickly becoming quite likely.

      SSL is already widely enough deployed to protect my credit card number, social security number and other such sensitive information. The push for universal SSL is motivated by increased *privacy*, and yet governments are not impeded by SSL in spite of them being the primary collector and abuser of network usage metadata.

      The right cause would be for a replacement for SSL and the CA infrastructure that defends against government interference. That must precede universal adoption of SSL, lest we mislead people into believing their privacy is protected when that is far from being the case.

  62. Lonnie.Severus wrote on

    Well, this all means I will probably switch to another browser.

    I will take a closer look at Slimjet, Vivaldi or Opera. I will for sure no longer stay with Mozilla.

    Total insanity in control…..

  63. MJ wrote on

    The EFF posted an article last year announcing plans to make SSL open and free to everyone. It appears they are focusing on making it easy to work with certs. Not only are they providing a CA – but also easy to use management software to download/install certs for web sites.

    Read the EFF release here…

    “Launching in 2015: A Certificate Authority to Encrypt the Entire Web”

    Which links to the Let’s Encrypt website.
    Read the How it Works section.

    1. Nate wrote on

      Yes, we all know about letsencrypt.
      We know that it isn’t out yet.
      We don’t know if it will work on anything but the newest webservers.
      We don’t know what sort of requirements it will have in order to do its magic.

      If Firefox wants to be about forcing people to do things they don’t want to, then they will fail, as they have been so good at doing lately.

  64. Luaks wrote on

    Oh yes, I agree! Let’s send the Mozilla guys into retirement! There is really only a fine line between stupidity and brilliance.

    I hated Australis, which was a clear shift away from classic Pro users towards simplicity-loving ones.

    I hated the new Chat feature because it was bloat.

    Let’s not start about the ads in the new tab page!

    Brendan Eich… Never forget!

    And now this!

    I agree so very much with this.

    Mozilla f*ck off!

  65. Denys Duvanov wrote on

    Hey, what about Google Analytics?
    Analytics doesn’t work without cookies.

    1. Daniel Veditz wrote on

      Google analytics works just fine over https.

      1. Roland Zink wrote on

        Which is exactly the problem. The browser broadcasts the visited URL to the big Internet companies. The NSA can get the URL this way too, and they don’t even need to break TLS security to do so.

  66. Walter wrote on

    Shared hosting is what makes having a web site affordable to most people. AFAIK Apache is not able to deal with certificates for several virtual domains; you need a static IP for each one. So by deprecating http you’re just encouraging users to use e.g. Facebook instead of investing in having their own web site. By analogy with what Red Hat is doing with Poettering’s systemd, you’re going in the opposite direction to “free” and “open”; you’re literally fucking users, freelance developers and SMEs in favor of multinationals.

    Dream is over.

    1. FF Extension Guru wrote on

      With shared hosting you need a multi-site cert (or UCC). My terminology may be off here, but the way it works is that you have your primary site and under that you have secondary sites. I have never done this so I don’t know exactly how it looks, but I’ve been told that every site is listed on each site’s cert.

      My issue is that my shared hosting provider sells these UCCs in packs of 5. I have 6 sites (plus I sublet some space for 2 other sites to a friend of mine, which are purely informational). This is going to be a major headache for me, as I would have to switch the primary domain from my personal site to my business site (no transactions take place, though I would like SSL for the login to the account page). All the sub-sites (including my personal one and my friend’s two) would then be listed under my business site when the cert is viewed.

      Then there is the process of applying and validating for an SSL cert on at least 6 sites. I would be looking at $250 a year for the certs on top of the $84 for the hosting and $100 for the domains (some of these sites resolve from several domains, which I just realized I would either have to add to the UCC or 301-redirect). Again, the majority of what I do is hobby, and my business site generates just enough revenue to cover the cost of the domains associated with it.

      1. Kirrus wrote on

        Multi-site certs are not suitable for shared hosting. UCC certs are just for Exchange servers.

        This is seriously shitting on small shared hosts.

    2. Travis wrote on

      Apache has supported SNI for several years now. If your shared host is still using an outdated Apache version, it’s time to change hosts.
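      The SNI mechanism Travis mentions can be sketched with Python’s standard ssl module (a minimal illustration; the function name and all certificate paths are hypothetical): during the handshake the server reads the hostname from the ClientHello and swaps in the matching certificate, which is why one IP address can serve many HTTPS virtual hosts.

```python
import ssl

def make_sni_server_context(cert_map):
    """One listening IP, many HTTPS hostnames, selected via SNI.

    cert_map maps hostname -> (certfile, keyfile); the paths are
    hypothetical. The callback runs during the TLS handshake, reads the
    cleartext SNI value from the ClientHello, and swaps in the matching
    certificate -- no extra IP addresses needed.
    """
    base = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)

    def choose_cert(ssl_sock, server_name, _initial_ctx):
        if server_name in cert_map:
            certfile, keyfile = cert_map[server_name]
            vhost_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
            vhost_ctx.load_cert_chain(certfile, keyfile)
            ssl_sock.context = vhost_ctx
        return None  # None tells the TLS stack to continue the handshake

    base.sni_callback = choose_cert
    return base
```

      Apache’s name-based SSL virtual hosts do the same certificate selection internally, which is why old non-SNI clients (like the Windows XP browsers discussed elsewhere in this thread) are the only real constraint.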

  67. John Vahn wrote on

    If you want to promote security, making people actually safer rather than simply feeling that way, start by redoubling your efforts to fix defects in your own software, which continue to endanger your users.

    Secondly, provide users more options than just certificates based on global trust regimes. Support PAKE as an alternate means of establishing secure sessions where users already have more localized means of trust (e.g., a password).

    Finally, stop confusing secure transport with trust bestowed by users. There is nothing you can safely do with a secure transport that you can’t already do with an insecure one. If anyone can obtain a certificate, then anyone is able to take advantage of all browser features. Never confuse political goals with technical ones.

  68. A. Zander wrote on

    Since I work for a small hosting company, I would like to know who will pay for the extra IP addresses we will be required to purchase because of this new policy? If Mozilla continues to push this, I will either be laid off as the company works its way toward shutting down, or they will start sending the Mozilla Foundation the bill for the additional IP blocks we will be required to purchase to continue offering web hosting.

    Which will it be? Mozilla et al. paying for our additional IPv4 usage, or Mozilla et al. paying for my unemployment? Those are the 2 options I currently see in my future with this new push.

    Think on this as well: this “encryption” is not what most people think it is. Those here will know, but the general public assumes that because it’s SSL, their data is encrypted all the time. THIS IS A LIE. I’ve tried to explain it to many people who are not tech savvy. Once they understand that it only encrypts the data from the server to their browser, and often nowhere else, they start to wonder why we use SSL at all.

    Want to really make a difference? Create a system that works in ALL email clients and automatically encrypts and decrypts email between people, without them having to do much more than put in an authorization code from the sender. Add it to their address book system so they only have to put it in once for each email address.


    1. Nate wrote on

      With SNI (available on anything not ancient) you don’t need additional IP addresses, unless your software is also ancient.
      I suggest that, rather than going out of business or losing your job, you update your servers.

      1. Kobor wrote on

        You do, since a lot of users are still coming from old browsers which don’t support SNI.

  69. John Snow wrote on

    Great, big changes. I am curious about the impact on Google PageRank for sites with HTTPS. There is also a problem with simple static sites which do not need SSL at all. Is there really a need to buy a certificate only to maintain position in Google Search?

  70. Truth Teller wrote on

    Someone follow the money. Decisions like this are always because someone is getting paid indirectly to make it. Look forward to reading in the future who got rich.

  71. James wrote on

    Good luck….http is still going to be in use for a LONG time, and locking out your users is just going to push them to a different browser.

    1. Ed Hands wrote on


    2. Ed Burnett wrote on

      Nah, websites that refuse to utilize HTTPS will be the ones losing users. It’s trivial to implement and the benefits for all involved are many. Encrypting all web traffic has been in the discussion pipeline for a long time now, and I suspect Chrome and the other browser projects will soon follow suit.

      1. Suki wrote on

        Sorry, but no. It may be trivial to you, but it certainly isn’t trivial for the regular small company or the IT-illiterate person who wants to start his/her own webpage.

  72. Jona wrote on

    The list of CAs shipping with Firefox includes numerous shady ones that have quite a history of failures. On the other hand, CAcert’s inclusion has been denied for questionable reasons.

    I find it kind of funny that this is published by Mozilla, proclaiming an “open and better web” while excluding the only CA that is truly open and non-profit.

    Presently, getting an SSL certificate is a hassle that costs a lot of money if you actually want more than one subdomain to have a proper certificate. I suggest you address that problem first, i.e. enable non-restricted and freely accessible SSL. Otherwise your “open web” will only be open to the ones who can afford it.

  73. Jonathan wrote on

    Bad decision all the way around. Security sounds great on paper, but for 99% of people it does not matter at all.

    Main issues:
    There is no easy way for a common person to install SSL. It involves multiple steps and then ongoing maintenance: dealing with OpenSSL issues, renewing the certificate, and keeping up with the latest SSL exploits. This adds another barrier to entry for a newbie getting into web development.

    It’s going to destroy shared hosting environments that can’t support multiple IPs, just for them to maintain all their end users.

    Not everything needs to be encrypted; this just makes the internet inefficient and slower.

  74. Johan Boule wrote on


  75. Janet Merner wrote on

    Do not get rid of http; just make https the default and http the fallback. As someone else wrote, mom-and-pop organizations that do not want to pay for SSL certificates should not be forced to. Considering that most sites without https either serve static webpages or run everything on their own servers using Perl or PHP, why should they be forced to adopt https?

    The developers are also forgetting that in developing countries with low-speed and sporadic-quality internet service, the bandwidth and quality of service needed for an https connection will shut a lot of people out. The problem with the Mozilla Foundation is that you are based in the United States and are completely oblivious that not everyone in the world has the resources that you do.

    There are children in some developing countries who are still using Pentium II computers over a 14.4 modem. Do you want to deny them the possibility of using the Internet and developing skills that may be the difference between a life of poverty and one of fulfillment?

  76. Brian LePore wrote on

    “For example, Firefox already prevents persistent permissions for camera and microphone access when invoked from a non-secure website.”

    Maybe it’s just because my dev environment has a self-signed SSL, but I’ve been working on this for a new feature for a bit for us at work and Firefox ALWAYS prompts every time.

    And really, this is a crappy idea. I’m sorry, but not EVERYTHING on the web needs to be secure.

    1. Daniel Veditz wrote on

      “Maybe it’s just because my dev environment has a self-signed SSL, but I’ve been working on this for a new feature for a bit for us at work and Firefox ALWAYS prompts every time.”

      Have you clicked the tiny button with the tiny triangle on it? There should be an “Always Share” option there on secure sites. If you don’t dig in and find that then the main button is just “share one time only”. Our UX folks seem to hate multiple buttons, but personally I think it would make that (and several other) prompts clearer. Hiding non-default choices in a drop-down has nothing to do with the SSL/TLS topic here, it’s just design style.

  77. aaa wrote on

    I’m all for increasing security, but the CA system is broken and we all know it.

    I would rather see some real security improvements, such as addressing fingerprinting via fonts, per-tab sandboxes, and hardened browser builds. Remove RC4.

    We as users do not want Australis, Firefox Hello, or anything else like that. We want a simple, lean, extensible browser which is secure.

    Each time I read the release notes for a new Firefox version I am always disappointed by the lack of real security improvements. That is my only concern: security. I’m not a Chrome user, as I feel Google has enough control as it is, but I am envious of the security of Chromium.

    If you need revenue, partner with DuckDuckGo, Startpage or Disconnect Search. I have been a loyal user for years, but I am seriously becoming jaded with the efforts. It feels like we all know what needs to be done, and yet nobody is willing to put in the effort to get us there.

    If you are as open and for security as you claim, I would suggest allowing Mike Perry and the Tor Browser team to have much more of a say in how to further secure Firefox.

    The modern browser is everywhere, in every device and business; we need more security at the application level. The only hope for the network level is networks like Tor.

  78. Lestat wrote on

    Anyway, I look forward to seeing you Mozilla guys reduce your market share even further with this and all these actions.

    Your problem is you guys do NOT think about things.

    Let me list your most grave mistakes!

    1) You implemented Australis and removed functions because you are unwilling to maintain a separate codebase for Desktop usage, because you got jealous of Chrome’s user numbers and hoped that enough Chrome users would jump ship so you would win in the end! And even worse: you INTENTIONALLY turned your back on Power Users, the ones who made you in the first place!

    2) You created Firefox OS because you again got jealous of Google’s success with Chrome OS.

    3) You let Brendan Eich receive more humiliation than was necessary at all, so he left on his own and your own hands stayed clean.. more or less.

    4) Signed add-ons… You saw that move from Google again and followed to have “Chrome parity”.

    5) Instead of offering users DRM-free versions of Firefox you forced DRM on everyone.

    6) The move against HTTP: Google made it first, so you must again try to beat them by being even harsher in pushing through that rule!

    See what I mean? Many of the mistakes you made, and make, are because you see Google as a role model!

    What is so wrong with doing your own thing again? Why do you try to emulate the Chrome feeling as much as possible?


  79. Ed Hands wrote on

    Wow… this is about as poorly conceived an idea as they come. Honestly, I understand the concern and the motivation behind it, but normal everyday users do not think “Wow… Mozilla is looking out for me and my security! Way to go!” They think “Gee… my cat videos don’t run when I use Firefox… let me switch to something that does.”

    The false assumptions that you are making are that a) people have a technical knowledge of internet security and are making rational decisions based on that knowledge and b) people have to browsers .

    In both cases, except for the crowd that has posted here, they don’t. People by and large don’t give a frog’s fat hiney about SSL certs and SHA-1 vs SHA-2 and such. They care that they can no longer visit their site. And they will switch browsers faster than you can say “Jack Robinson” to get to their site, and make that browser their default.

    An ambitious, and good-intentioned, plan. But you know what they say about good intentions….

    Good luck with this.

    1. Ed Hands wrote on

      Correction to the above:

      b) people have loyalty to browsers .

      1. Omega wrote on

        The VAST majority of people are not loyal to browsers. They use the “blue E”. That’s all.

  80. Yuval Levy wrote on

    Your intention is laudable and the direction is the right one. However, please consider the following:

    (1) Encryption comes at a cost. It is time to end the taxation regime of certificate authorities (CA). Please hold off with your plans until is publicly and easily available.

    (2) The current trust model is broken beyond repair. I honestly prefer to trust self-signed certificates from responsible site owners I know personally over certificates signed by some shoddy CA. The number of default root certificates in the different browsers and devices is mind-staggering, and when investigating them I would like to revoke trust in the majority of them. I no longer trust curated collections of root certificates, whether they come from Microsoft, Mozilla, Google, Apple, or any other distributor. Please give me control over the root certificates that come with Firefox: disable them all by default and prompt me on a case-by-case basis, when I access a secured website, whether I want to trust certificates signed by that specific root CA.

    When the cost of encryption to website publishers will be merely computing cost; and when the problematic blind trust in root certificate and mistrust in self-signed certificate are solved; then you will have my full support to deprecate HTTP.

    Until then, artificially restricting the availability of new features to encrypted sites only is a counter-productive publicity stunt and will only drive users away from Firefox.

    1. Yuval Levy wrote on

      The URL of let’s encrypt dot org has been filtered out in my point (1). It should read:

      (1) Encryption comes at a cost. It is time to end the taxation regime of certificate authorities (CA). Please hold off with your plans until LETSENCRYPT DOT ORG is publicly and easily available.

  81. Graham wrote on

    Good thing I stopped using Firefox a year and a half ago. To those of you who still deal with Mozilla’s crap… Well, have fun!

  82. Norman wrote on

    Riding a bicycle is dangerous and exhausting, so let’s deprecate it!!

  83. Jon wrote on

    Does that mean HTTP will stand for: Hyper Text Transfer Phoenix?

  84. evan wrote on

    The Internet is not worth the trouble. I disconnected my home service and now only go to coffee shops occasionally for service. Mozilla Firefox seems to have introduced the same scheme that Microsoft uses to monitor and track its servants: creating an intentional error, reporting to you that there is an error, and then forcing you to fix the error. Do-Loop…

    I am going to switch from Linux Mint 17 to Blag. Blag features a version of Firefox where “automatic updates” (= taking control of your system) are disabled.

    GNU-Linux is the only way to go.

  85. Mark wrote on

    This is over-the-top, whacked out techno-evangelism. This is weird and paternalistic and unnecessary. Raising the bar for casual web content on the best medium the planet has for free expression is just not a thing that needs to be done. I wonder what agenda there is here but I think it’s just myopic nerdy stupidity.

    I will happily allow my casual web projects to break in Firefox requiring people to switch to some sane browser whose dev team doesn’t feel the need to inject weird paternalistic nonsense into their architecture. So long, Firefox.

  86. Wat wrote on

    Wow. I had no idea an organization as large as Mozilla could be so off-base. This will be an IT support nightmare when all these sites with self-signed certs pop up a dozen warnings that make users think they are being hacked. Are you actually trying to follow Opera off the cliff?

  87. Vasili wrote on

    Requiring HTTPS is like wearing a thermal jacket at all times, even when it’s warm.

  88. Locke Cole wrote on

    Mozilla’s gone off the rails, must be time to fork Firefox and work on a version that remains true to the goals and designs it was originally intended to meet. Making things “not work” is not the answer. Forcing site operators into additional fees (mandatory static IP address for using an SSL certificate, the cost of the certificate, etc) is ridiculous. Especially for folks that run casual/fun websites for personal or friends use.

    I get that the internet at-large needs more security for some things, but pulling the plug on something that isn’t broken to try and force people into something they aren’t asking for is almost the definition of bad business.

    1. Kyhwana wrote on

      Additional IPv4 addresses aren’t required; this is what SNI is for.
      There are also free certificates available (including hopefully soon

    2. Anees Iqbal wrote on

      Did you know that CloudFlare offers FREE SSL?! You don’t need a dedicated IP or anything. All you need to do is add your website to CloudFlare, and et voilà!

      1. lozl wrote on

        >> C*Flare offers FREE SSL

        Then there will be encryption only between the user and Cloudflare.
        And from Cloudflare to your site there will be none.
        The NSA will get everything in plaintext.
        Cloudflare is a MITM.
        You are from the marketing division and don’t know what you are talking about.

  89. Catman wrote on

    So Mozilla has gotten into bed with NSA?

    Time to remove ‘Firefox’ and get a browser that works for me, instead of against me.

    1. Daniel Veditz wrote on

      Odd comment, why would the NSA prefer encrypted traffic to plaintext? We rather think they _won’t_ like this move.

  90. Lalo Martins wrote on

    I thought your excuse for implementing web DRM was that people would just abandon Firefox otherwise. Don’t you realize that’s exactly what will happen if you do this? I mean, what’s more likely: everyone starts encrypting stuff because otherwise the #2 browser doesn’t work properly, or everybody who’s using the #2 browser switches to something else that does work?

  91. Adrian Roselli wrote on

    A PDF for the FAQ? C’mon guys.

  92. Suki wrote on

    This might sound crazy but how a bout this:

    – You launch Let’s Encrypt, make sure it is working properly, and establish it as an easy solution for migrating to https.
    – You talk to webhosting companies about migrating to https, or offer affordable solutions to the millions and millions of websites that are currently on shared hosting accounts and have no ability to use https unless they pay an extra fee.

    Then AND ONLY THEN you start talking about deprecating http….

    How does that sound? Seems pretty logical, no? That way you avoid the hordes of angry users shouting at you for segregating webpages all over the place.

    In all honesty, it seems the Mozilla guys think the internet should ONLY be available to two kinds of people:

    – people with money
    – people with technical knowledge

    Which is simply insane! And it makes me wonder whether you guys haven’t spent way too much time in your Silicon Valley bubble…

    1. tfs wrote on

      Amen! The number of shared-hosted websites out there is gigantic. It’s totally pointless to talk about deprecating HTTP until these shared hosting companies offer those free, oh-so-easy-for-everybody certificates.

  93. Roman Naumenko wrote on

    This is great news.
    Data in transit should have mandatory encryption, period.

    Are there any plans to phase out non-secure transmission over SMTP as well?

  94. negecy wrote on

    Typical pointless American action. Think about people with poor bandwidth who just want to check trivial information like the weather or airport–city transit. It hardly matters whether there is surveillance on that, and it is much better if such information can be cached for speed. Requiring everything to be https has additional negative effects, as it requires everyone to be able to run https. So something like Let’s Encrypt becomes a requirement, which in turn damages https itself: the only way to handle a huge number of free certificates is (full) automation, which means any phisher, cyber-criminal, or cyber-terrorist gets https too, damaging trust in https altogether, because authentication that can only be automated has a value of roughly zero.

  95. John Doe wrote on

    OK, so the US gov says so (no more http). Well then, who runs the NSA, CIA, Homeland ‘Security’ and all the other law-breaking agencies?

    If the US gov says so, probably the opposite is the better way. Why do you guys want to help the US gov (& co) be the one who can intercept web traffic? Do you really believe they don’t have their hands on the certificates? Who runs VeriSign? Who is at the top of the certificate authority structure?

    It seems you guys have too much time, or get paid too well from obscure sources for esoteric purposes. In most cases, people doing so abolish themselves. You’d better look at what Firefox already does: dozens of requests in the background, unnecessary services with lots of bugs, and so on. Maybe it is time for another browser, one which does not impose a ton of fancy services or try to restrict which pages can live and which get CENSORED (oh sure, just for our security – for our best, of course! Like always).

  96. Alexander wrote on

    You should never force HTTPS.

    The wins are rather subjective and hard to confirm.

    But using HTTPS creates problems for regular webmasters:

    – Websites will be slower on average; the webmaster needs better hardware or has to pay more to his hosting provider.
    – HTTPS support is not always possible; for example, some CDNs can’t support HTTPS in certain modes.
    – Third-party resources linked in HTML can lack HTTPS support, which will make the website work incorrectly over HTTPS. And you need to monitor this forever… for all links on the website! This point is valid for a huge % of websites.
    – By enabling HTTPS-only you can easily lose 20% of visitors, since not all browsers support your certificate.
    – A vulnerability in an HTTPS library can lead to the website’s origin server being hacked. The problem here is that these libraries are code executed directly on the server: if there is a vulnerability, you can not only decrypt the traffic but also execute code on the server.
    – Certificates are just bundles of problems: revocation, revalidation, library deprecation. It’s also worth mentioning that the certificate system makes the web centralized. When someone visits your HTTPS website, it basically queries some other central server; whoever controls that server can get information about all your visitors. And that’s shocking, I think.

    I am not against encryption, but do not FORCE it. HTTP is not LEGACY; it’s HTTP, the protocol which should be here forever. It’s good, fast, and well enough. Whether HTTPS is more secure than HTTP is really a tricky question. Encryption sometimes helps to prevent injections, but that’s rather easy to bypass. Can the NSA decrypt your HTTPS? Most probably yes. Can the webmaster of a website spy on you over HTTPS? Yes, and it’s even easier with HTTPS and HSTS because of the HSTS super-cookie. Does HTTPS protect your password? Well, there is a chance, but if you think that HTTPS is a magic cure, you are a complete idiot.

    My vote: never use your browser again if you deprecate HTTP. It’s very easy to find an alternative or to fork your code, so think for yourself how much such a decision can cost you. I want to say this to the Chrome dev team as well. The Internet lives on its developers. If you start to do shit things, you will be replaced.

  97. Leniy wrote on

    Secure https is also not safe when using local cc

  98. Hubot wrote on

    This is the end point in a long line of bad decisions.

    My free site, with all my free resources, has to earn money; I can’t pay for all the servers out of my own pocket. And it’s a matter of fact that your revenue drops drastically if you switch to https, because many advertisers are not ready for it.

    I will stop recommending Firefox. Mozilla has become a @$%6&”

  99. Martin wrote on

    The recommendation of WoSign in the FAQ is questionable; they still use SHA-1 for their own website’s certificate.

  100. KM wrote on

    HALT! Your certificate please! … No certificate? Off to the Data Gulag!

    Where was this crazy idea born? At a meeting of net-hipsters and out-of-reality technocrats?

    What would happen if they succeed with this, another shortsighted feel-good solution?

    Many companies would not allow web usage anymore, since encrypted traffic cannot be controlled (data leakage), except for the known friendly sites, in short Google and Facebook! Yes, it’s sad, but for many people this would be enough.

    Power usage would multiply and new hardware would be needed; in short, a lot of new costs and complexity.

    And more.

    Some guys and girls at Mozilla seem to be good helicopter parents, with a vision of what people need and, even more, of what they don’t need.

  101. Roger wrote on

    The real reason for deprecating HTTP and forcing even unimportant websites to use HTTPS is so that restrictive governments can make sure they’re arresting the right party on mere suspicion or curiosity.

    Encryption also wastes more energy than just using plain text, and usually requires people to upgrade to newer and faster hardware.

    If there were other, more legitimate reasons aside from fear, we would have been told by now.

    It’s like this analogy: because I do not feel safe traveling streets with rowdy bars at night, I’m going to carry a gun (or be a vigilante), versus just choosing to avoid the troubled streets at night. People have choices, and I think I’ll choose not to use encryption when I obviously do not need it the majority of the time. It makes me sick to see people devote themselves to writing code and climbing the ladder of life, only to endorse such meaningless policies for promoting controversies. What a waste of time.

    1. edison wrote on

      Couldn’t agree more!

    2. Dan wrote on

      Richard Barnes (Firefox Security Lead) sold his soul to the devil, and this is why he is pushing this agenda. All CAs have been compromised, which makes any SSL certificate insecure. I personally consider the PKI as good as clear text. If Barnes is at all intelligent, he should know this. By forcing website owners to buy SSL certificates, he is opening the door to attacks on privacy and to censoring those whose content the government does not like (of course, to protect the poor and vulnerable children from dangerous websites like WikiLeaks).

      1. Oliver wrote on

        Complete and utter rubbish.

        Even if a CA is compromised, you don’t give the CA your private key; they simply sign your public key, and it’s up to web clients to determine whether they consider your certificate valid.

        Your assertion that “All CAs have been compromised” is pure brilliance… care to produce some actual evidence to back that up?

        1. Paul M wrote on

          It doesn’t matter whether you give the CA your private key or not if the CA has been compromised, because those with control over the CA can MITM any connections you make and you’ll be unable to tell.

          1. Joao Santos wrote on

            Because unencrypted connections are way better against MITM /s

    3. Samehere wrote on

      Check out

    4. Jens wrote on

      I’m afraid he’s right. Tell people they care about security, and then have them use an already compromised technology. So sad most people don’t know this. Conspiracy?? Well, sometimes when they cry wolf, a wolf will come. Look at documentaries on YouTube about the 2008 crisis, about 9/11, about how the Federal Reserve robs every American. So many strange things going on. Bush saying PUBLICLY, “Let us not listen to conspiracy theories. Let us focus our time on catching the terrorists.” If you don’t want conspiracy theories, then let the public see the evidence instead of hiding 90% of it. It’s sad, and most likely they will get away with it.

  102. Roger wrote on

    It should also be mentioned that things seem to have gone pretty well after the Bible was translated, or “decrypted”, into the King James Version.

  103. Pffff wrote on

    Way to go Mozilla, to demolish your userbase; giving them to m$ and g00gl for free…

  104. Jason wrote on

    Since I’m not a techie, I don’t really understand what this means for my ability to access the sites I want. However, several of my relatives and friends have websites. If I can’t get to them (and others) using Firefox, I’ll use another browser.

    1. Chris wrote on

      Nothing, really. In the short term nothing changes; in the long term some browser features may not work.

  105. liderbit wrote on

    Mozilla, you will lose your already decreasing number of clients in favor of chrome. Your strategy pretty much *****

  106. CoolFire wrote on

    I’m sure this is good news for the hosting providers who are still charging people for an ssl cert. And on a shared hosting platform, you generally don’t have the access to the config you need to install your own cert.

  107. M. Edward (Ed) Borasky wrote on

    While this seems wonderful on the surface, it is not cost-free for the website owner. It requires the *purchase* and installation of a certificate, and regular renewal.

    I think we need to think harder – make it free to the website owner or come up with a solution other than HTTPS.

    1. Chris wrote on

      There are free cert providers out there.

      1. Jeff wrote on

        They aren’t fully supported.

        1. Samehere wrote on

          Check out

          1. Grover wrote on

            You keep posting this like it’s the solution to everyone’s concerns, but it’s not even a live site yet. This is not a real solution until they start issuing certs and every system sees them as valid/better than self-signed.

          2. foreigner wrote on


            Just another vendor lock-in and single point of failure.

  108. open-source wrote on

    Why are you punishing open-source projects?

    There are a lot of ISPs and other companies (e.g. Blogger) who also can’t provide SSL to their users, whom you are punishing in very large numbers with this decision.

    Code hosting sites such as SourceForge project web also do not support SSL, so you are in effect punishing a very large number of open-source projects that can’t afford their own web hosting.

    You should be working with the industry on a coordinated effort to deal with these issues, instead of making yourself irrelevant to both the developer community and individual users.

    The fact that this will severely punish thousands of SourceForge projects (among which some of your own source code lives) is very short-sighted; you really should work with the industry, or you will be making yourself even more irrelevant.

  109. NameRequired wrote on

    No problem! The deep web will grow, those who cannot afford certificates will disappear into the deep sea, governments will control more and better what’s on the web, and I will miss Firefox.


  110. Luc wrote on

    Have they thought about the growing number of (local) devices with a web-based UI?
    Routers and NASes are just the ones we’ve known about for years already.

    I think someone at Mozilla has just bought a stake in a certificate authority, and is now pushing to make a certificate for each IoT device a requirement to get it up and running.

    1. 22decembre wrote on

      Mozilla finances itself by auditing CAs before including them in the browser.

      So they have not bought shares there: they charge the CAs! And they are also mounting their own free CA!

      1. Daniel Veditz wrote on

        The CAs pay no money to Mozilla for inclusion in the browser. We do require audits but we do not perform them, and we do not receive payments from the auditors. Maintaining the list of CAs in our browser is strictly a money-losing proposition for us (but necessary and good for the web).

  111. NameRequired wrote on

    Instead of blocking self-signed certificates and highlighting some special certs, Firefox should show a ranking of how trustworthy the certificate is (manually added, selected CAs, official CAs, CAcert, self-signed), in the form of a traffic light.
    Additionally, it should show how secure the connection is (PFS, cipher, hash, etc.).

  112. Jeff wrote on

    You know the only way this could work is if SSL certificates were secure and free.

  113. Samehere wrote on

    Just a short wait. Check out

  114. grin wrote on

    Funny how everyone tech-savvy kind of ignores the entropy problem. A normal website host can handle millions of connections easily, while it will simply choke and die on a moderate number of secure connections, which require lots of random numbers, which require lots of entropy. Ever wondered why businesses selling entropy sources are in business? Yeah, you may say “hey, if you want high traffic, spend the money”, forgetting that that’s exactly what the fuss is about. Many people do not want to pull up a server farm to serve their pages, or try to get entropy somewhere. Virtual servers? Oh yeah, even more fun with entropy.

    Okay, so you don’t want your shiny new https server to wait tens of seconds for blocking random numbers, so you use urandom. Which means _pseudo_-random. The more entropy you draw, the more pseudo, the less random, and the less security in the encryption.
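    The urandom trade-off described above looks like this in code; a minimal Python sketch (the 32-byte amount is illustrative):

```python
import os

# os.urandom reads the kernel CSPRNG (/dev/urandom, or the getrandom()
# syscall on modern Linux). It does not block once the pool is seeded,
# which is the fallback behaviour the comment describes.
session_secret = os.urandom(32)  # e.g. 256 bits of key material for one connection

assert len(session_secret) == 32
```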

    And yes, setting it up is a great hassle, and doing IP-based virtual hosting is a hassle (yeah, get more IPv4 addresses; oh, you mean we ran out two years ago? What IPv6? Where? When?).

    But apart from all that, there are still units with 10+ year old code running on their management cards. They will support https approximately on April 1st, 2048.

  115. Mozinet wrote on

    A Mozilla FAQ in PDF? Really? It’s not cool for readability on mobile, accessibility and SEO.

  116. F. Ree wrote on

    Once upon a time “free software” with “free” as in “free speech” meant that the user was empowered and could do what he or she wanted.

    For quite some time now, Firefox, Thunderbird and Mozilla have been developing into net-nannies, telling the user what they (Mozilla) feel is good or bad for them. Maybe that’s a good approach for digital novices, but it certainly is not a good idea for the user community which initially helped make Firefox what it is today.

    Maybe some smart person will come up with the idea of making all of this an option which can easily be turned on or off. If not… well… we will have to use a “more” “free” browser when Mozilla ruins it again.

    Somehow it looks like every good browser has to crash in numbers once in a while to remind the second generation of developers what had been the reason to develop it for the first generation of developers.

    TL;DR: Cut that crap, Mozilla and keep Firefox FREE. And learn what FREE software means if you forgot that…

  117. Sigh wrote on

    It’s almost like Mozilla is intentionally killing Firefox. I can hear the complaints now: “Why is this website showing a security error? I use it all the time and it works fine in Chrome.”


    1. Sighing louder wrote on

      That’s right. Even now, Firefox (my favourite browser, for now) occasionally tries to ruin my web surfing by choosing for me what’s risky and what’s not, on absolutely clean web resources that I then have to either add to “trusted websites” or simply open in Chrome. I can’t imagine what will happen when this “removing capabilities” takes place…

  118. Fx-User wrote on

    This encryption idea should be an option that the computer’s administrator decides on. As with the extension-signing idea, there is no flexibility given to the user. The thing that got Firefox rolling was the ability to browse the web as the user chooses, through extensions and so forth.

    This shift into edicts is a big mistake.

  119. Enrique wrote on

    For once, I do not agree at all with this decision. TLS requires a complex implementation that greatly increases the possibility of remote attacks. At least for local intranet and loopback connections, support for plain, non-TLS HTTP must continue to exist.

  120. Dave Ross wrote on

    Forcing the use of HTTPS will not necessarily guarantee security… I’d call it by its name: YACBA (yet another captive-business attempt).

    What’s wrong with letting the world be FREE (even to make mistakes)?

    I really cannot understand your way of doing things… Firefox is less stable than it should be, and this will force people into choices they are not willing to make… do you realize that you are going to lose part of the browser share that was so hard-won over the past years?

    My 2 cents.

  121. Mildred Ki’Lya wrote on

    A design issue raised by Tim Berners-Lee with https:

    Don’t break the web

    There is currently (2014–15) a massive move to get the web secure, in the sense of encrypted and authenticated. Of encryption and authentication, the encryption part has garnered the most attention, both among its promoters and among those in governments who protest that it gives too much power to users, criminals included, compared with law enforcement. Projects such as Let’s Encrypt and the EFF’s HTTPS Everywhere, for example, promote a wholesale move to the HTTPS protocol.

    The concerns behind the need for security are valid; there is a lot of abuse it would prevent. The problem with the HTTPS Everywhere drive is when the “S” is put into the URI. Moving things from http: space into https: space, whether or not you keep the rest of the URI the same, breaks any links to them. Put simply, the HTTPS Everywhere campaign, taken at face value, completely breaks the web. In a way it is arguably a greater threat to the integrity of the web than anything else in its history. The underlying connection speeds have increased from 300 bps to 300 Gbps, IPv4 is being moved to IPv6, but none of this breaks the web of links in so doing.

    TLS Everywhere

    A proposal, then, is to do HTTPS everywhere in the sense of the protocol but not the URI prefix. A browser gives the secure-looking user interface message, such as displaying the server certificate holder’s name above the document, only when the document has been fetched authenticated, over an encrypted channel. This can be done by upgrading the HTTP connection to include TLS in real time, or in future by just trying the encrypted version first. There has been some discussion of this, including RFC 2817 (2000), “HTTP Upgrade to TLS” (though that was apparently motivated by the need to save low-numbered ports, an issue I omitted from the table above).

    The HTTP protocol can be, and by default would be, upgraded to use TLS without having to use a different URI prefix. The https: prefix could in fact even be phased out, with user education focussed instead on understanding the level of assurance being given about the level of security, including authentication of the other party, encryption of the communication, and the anonymity, traceability, or strong authentication of the user to the other party.
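    The RFC 2817 upgrade mentioned above is just an HTTP header exchange before the TLS handshake; a sketch of the client’s side (the host name is illustrative):

```python
# RFC 2817 "Upgrade to TLS within HTTP/1.1": the connection starts as plain
# HTTP on port 80; if the server answers "101 Switching Protocols", both
# sides then run the TLS handshake over the same TCP connection.
upgrade_request = (
    "OPTIONS * HTTP/1.1\r\n"
    "Host: example.com\r\n"      # illustrative host
    "Upgrade: TLS/1.0\r\n"
    "Connection: Upgrade\r\n"
    "\r\n"
)

# The server's agreement, for reference, looks like:
#   HTTP/1.1 101 Switching Protocols
#   Upgrade: TLS/1.0, HTTP/1.1
#   Connection: Upgrade
```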

  122. Andrea Ronchetti wrote on

    But if I want to view an HTML page saved on my hard disk, can I still do it? And will there be problems with software such as EasyPHP?

  123. Victor wrote on

    Is this really the Mozilla blog? The blog of my favourite browser? I can’t believe it. To be honest, I think this decision contradicts your principles.

    And anyway, why do you think non-profit sites and personal blogs should need an HTTPS certificate?

  124. Dag wrote on

    Add built-in support for RFC 6698 (DANE) first. Today, or at least this year. In all major browsers. Then hosting providers can add HTTPS on an industrial scale, using DNSSEC, TLSA records in DNS, and self-signed certificates, bypassing all the hassle and security issues of CAs. This is the ONLY way to get a huge proportion of websites to support HTTPS. Oh, BTW, that might interfere somewhat with the revenue stream from CAs to Mozilla, maybe.
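    For context, the TLSA record mentioned above is simple to derive; a sketch of the common “3 1 1” form (the placeholder bytes stand in for a real DER-encoded public key, which in practice you would extract from the server certificate, and the domain is illustrative):

```python
import hashlib

# TLSA "3 1 1": usage 3 (DANE-EE: trust this end-entity key directly),
# selector 1 (match the SubjectPublicKeyInfo), matching type 1 (SHA-256).
spki_der = b"<placeholder for the DER-encoded SubjectPublicKeyInfo>"

rdata = hashlib.sha256(spki_der).hexdigest()
tlsa_record = f"_443._tcp.www.example.com. IN TLSA 3 1 1 {rdata}"
```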

  125. Dan B wrote on

    Is this a late april fools joke?

  126. Sérgio Carvalho wrote on

    Firefox has a user-share problem. Enlarging the user base and stopping user loss should be your first and foremost priority. Limiting browser features is so obviously wrong that it shouldn’t warrant explanation.

    This decision reeks of the dictatorial power of a deluded, once powerful, but no longer influential, dictator.

    The result won’t be that webmasters heed Mozilla; it will be a further descent into irrelevance as users flock to browsers that work.

  127. SjorsK wrote on

    Even though I am a huge fan of encryption, I believe that the certification system as-is gives false trust. I would love it if things were made a little bit easier, to mark self-signed certificates as more secure than pure http. I do believe that identity validation is a good thing to do when setting up a business, but this will sincerely hurt websites that are set up as a hobby.

  128. Erm wrote on

    I do a lot of development at home. My test server is on my LAN, and if I understand what you’re suggesting correctly (at least according to your FAQ: “As noted above, everything that works today will continue to work for a while, so we have some time to solve this problem.”), I’m not going to get the full range of features Firefox offers for my local dev server inside 192. because it doesn’t have a cert.

    What if I want to develop a site that uses such tags and you add new features?! I won’t be able to use them!

    I’ll have to set up nginx, generate a self-signed cert, and have it proxy to the http server… what a waste of CPU and dev time. Just so I can keep using Flask.

  129. clem wrote on

    Certification is a centralized system; it’s a freedom issue.

  130. Andrew Aitchison wrote on

    I was taught that one factor that aided the cracking of the German Enigma ciphers was that the Luftwaffe used the same encryption for the weather forecast as for top-secret messages.

    Encrypting everything may improve privacy, but are you sure that it won’t reduce the security of the most secure information?

  131. Owl wrote on

    When I talk to my friend in the coffee shop there is a danger that we will be overheard and our privacy violated. Obviously the solution is for coffee shops to install loud white noise generators to make voice communication difficult… and then I will be much more secure as I pass written notes back and forwards to my friend.

  132. Cos wrote on

    In your FAQ, would you please replace “IT guy” with a less-gendered term?

  133. Valtteri wrote on

    I’m going to drop Firefox support from my site. I’m very disappointed.

  134. Kaos wrote on

    Such a bad idea!

    I will start to inform our users to change web browser from Firefox to Google Chrome.

  135. SJD wrote on

    No Firefox user agents on my page anymore!

    Was quite simple:

    RewriteCond %{HTTP_USER_AGENT} Firefox/29 [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} "Firefox/[3-9][0-9]" [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} "Firefox/[0-1][0-9][0-9]" [NC]
    RewriteCond %{REQUEST_URI} !^/redirectpage\.txt$
    RewriteRule ^(.*)$ /redirectpage.txt [L]

    The rules above ban the newer versions released after the end of “Firefox classic”;

    or use this more absolute one below:

    RewriteCond %{HTTP_USER_AGENT} "Firefox/" [NC]
    RewriteCond %{REQUEST_URI} !^/redirectpage\.txt$
    RewriteRule ^(.*)$ /redirectpage.txt [L]

    Replace “redirectpage.txt” and “redirectpage\.txt$” with the name of the page you want Firefox users redirected to instead of your normal page.

    Bye bye Firefox!

  136. Owl wrote on

    It seems to me that some people here are over-reacting and making changes before this is implemented and we see how it pans out, or whether Mozilla folk even change their minds.

    I would like it if we could use https everywhere… but the reality is that there are quite a few countries which block non-whitelisted https sites.
    Major sites such as banks can get onto the whitelist… but many lesser sites never will.

    The danger of switching to https everywhere is that you then cut off a lot of users in these countries.

    We can hope that going to https will build pressure for these countries to relax their policies but I think that is naive – what it will do is build pressure for them to get the same control over CAs/root stores that the NSA must already have.

    (Sure, users can run a VPN, but that is a game of cat and mouse which I suspect is going to end badly for the VPN users.)

  137. Aditya wrote on

    Apparently the people at the top of Mozilla, with pockets full of money, can’t understand how hard it is to implement HTTPS for small website owners. Personally I prefer visiting sites over HTTPS, especially big sites; to be precise, it’s almost a requirement for those big sites to use HTTPS. But when you’re running your own sites it’s different, especially if those sites are just small sites that don’t generate income, or generate too little. Here’s why:

    – Most sites are hosted on a shared hosting plan or a cheap VPS with very restricted resource usage. Adding HTTPS is going to get you kicked for exceeding your resource limits, especially if your sites have many visitors but you’re not generating money, and there is not enough reason to buy certs for those sites. (Here’s an example from one of the big hosting providers; they even tell you to avoid HTTPS as much as possible.)

    Quoted from their page:

    “Avoid using https protocol as much as possible; encrypting and decrypting communications is noticeably more CPU-intensive than unencrypted communications.”

    – I keep seeing Let’s Encrypt being promoted here and there in this comment area. Will Let’s Encrypt give you wildcard certs for free? Some people prefer to use a subdomain for static files, among other reasons, and a wildcard certificate isn’t cheap.

    – HTTPS requires one IP address per certificate. Yes, I know about SNI, but what if for some reason I (or the admin, or the webmaster) don’t want to use it, for privacy reasons? If you only have one IP address and you’re being forced to use HTTPS, all your sites need to be listed in the certificate, and all your domain names can then be seen by anyone simply by looking at the certificate. And this violates my privacy.

    – If you say there’s Cloudflare: can you even guarantee they will last forever? Besides, if you’re talking about privacy and security, giving your site statistics to Cloudflare, a third party, contradicts what you’re trying to achieve. You’re forcing website creators/owners to give their statistics to a third party.

    – A mom-and-pop store that generates small revenue can’t use HTTPS, or won’t bother to use it, because it adds more cost. Most of them are hosted on shared hosting plans that don’t allow you to change configurations, so you can’t use the free SSL certs from StartSSL or any other free provider; unfortunately, only two such providers exist so far, soon to be three if Let’s Encrypt is accepted by the major browsers.

    Also, if you think those mom-and-pop stores are running a full-blown ecommerce solution, with add-to-cart and checkout functions, then you’re dead wrong, because mom-and-pop stores mostly use static HTML or a simple WordPress website. When someone orders something from the site, it is by phone or email.

    In short, HTTPS is still too expensive for small website owners/creators.

  138. Bob wrote on

    Dear Moz,

    Would you be a dear and moderate these comments? There seems to be a certain someone with anger issues who keeps posting here.


    PS. BTW, thanks for pushing the web towards encryption as the default.