posted by n1 on Wednesday November 19 2014, @07:09PM   Printer-friendly
from the free-market dept.

The EFF will be offering a free certificate authority (CA) to help the web move from HTTP to HTTPS. This will be launching in 2015.

EFF is pleased to announce Let’s Encrypt, a new certificate authority (CA) initiative that we have put together with Mozilla, Cisco, Akamai, IdenTrust, and researchers at the University of Michigan that aims to clear the remaining roadblocks to transition the Web from HTTP to HTTPS.

Although the HTTP protocol has been hugely successful, it is inherently insecure. Whenever you use an HTTP website, you are always vulnerable to problems, including account hijacking and identity theft; surveillance and tracking by governments, companies, and both in concert; injection of malicious scripts into pages; and censorship that targets specific keywords or specific pages on sites. The HTTPS protocol, though it is not yet flawless, is a vast improvement on all of these fronts, and we need to move to a future where every website is HTTPS by default. With a launch scheduled for summer 2015, the Let's Encrypt CA will automatically issue and manage free certificates for any website that needs them.

https://letsencrypt.org/

Related Stories

Let's Encrypt Has Issued Its First Gratis SSL/TLS Certificate

Josh Aas of The Internet Security Research Group reported on September 14:

Let's Encrypt passed another major milestone by issuing our first certificate. You can see it in action here

Our cross signature is not yet in place, however this certificate is fully functional for clients with the ISRG root in their trust store. When we are cross signed, approximately a month from now, our certificates will work just about anywhere while our root propagates. We submitted initial applications to the root programs for Mozilla, Google, Microsoft, and Apple today.

We're thrilled to finally be a live [certificate authority]. We'll be working towards general availability over the next couple of months by issuing certificates to domains participating in our beta program. You can request that your domain be included in our beta program by clicking here.

If you want to get involved with Let's Encrypt, please visit this page.


See our prior coverage: EFF Offers Free Certificate Authority to Dramatically Increase Encrypted Internet Traffic, The "Let's Encrypt" Project Generates Root and Intermediate Certificates, and "Let's Encrypt" gets a Launch Schedule.

Original Submission

  • (Score: 2) by mechanicjay on Wednesday November 19 2014, @07:19PM

    by mechanicjay (7) <{mechanicjay} {at} {soylentnews.org}> on Wednesday November 19 2014, @07:19PM (#117807) Homepage Journal

    You can get a free ssl cert today, I can generate one in about 2 seconds.

    The problem is a matter of trust. So what they're offering is a free signing authority, which has a widely-distributed trusted root cert. I personally find the concept kinda bogus -- that a self-signed cert is reported as "Dangerous and legitimate sites won't display this warning" by browsers. I guess this is a step in the right direction?

    Of course, I'm not sold on the let's encrypt the entire web movement either, so I'm the worst one to judge.

    --
    My VMS box beat up your Windows box.
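
    The parent's "2 seconds" claim is easy to demonstrate; a minimal sketch with the openssl CLI (the file names and CN are placeholders invented for the example):

    ```shell
    # Generate a 4096-bit RSA key and a self-signed cert valid for one year.
    # -nodes leaves the key unencrypted; "/CN=www.example.test" is a placeholder.
    openssl req -x509 -newkey rsa:4096 -keyout key.pem -out cert.pem \
      -days 365 -nodes -subj "/CN=www.example.test"
    ```

    That cert encrypts traffic just fine; what it lacks, as the thread goes on to discuss, is any third party vouching for who holds the key.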
    • (Score: 2) by ticho on Wednesday November 19 2014, @07:41PM

      by ticho (89) on Wednesday November 19 2014, @07:41PM (#117818) Homepage Journal

      Exactly. The entire SSL certificate thing is a farce, intended to make money for the CAs. The sooner regular people stop believing that green bar in browser address bar means they're safe, the better. This EFF initiative does not help at all.

      • (Score: 2) by mechanicjay on Wednesday November 19 2014, @07:50PM

        by mechanicjay (7) <{mechanicjay} {at} {soylentnews.org}> on Wednesday November 19 2014, @07:50PM (#117822) Homepage Journal

        Well, it sort of helps in that it makes the greenbar free for anyone. Which ultimately cheapens the "brand". Which is a good thing. Ensuring that you're talking to who you think you're talking to on the internet is really an exercise in futility.

        --
        My VMS box beat up your Windows box.
    • (Score: 2) by Sir Garlon on Wednesday November 19 2014, @07:46PM

      by Sir Garlon (1264) on Wednesday November 19 2014, @07:46PM (#117820)

      A self-signed certificate provides assurances of confidentiality but not identity (of the server). I think both are important, but if confidentiality is your primary concern then I agree a self-signed cert should be fine. Non-technical users probably underestimate the possibility that the web site they are visiting could be a fake (through for instance a malicious link, or, less likely, a DNS compromise) so I believe they need that identity guarantee.

      --
      [Sir Garlon] is the marvellest knight that is now living, for he destroyeth many good knights, for he goeth invisible.
      • (Score: 0) by Anonymous Coward on Wednesday November 19 2014, @08:29PM

        by Anonymous Coward on Wednesday November 19 2014, @08:29PM (#117834)

        > if confidentiality is your primary concern then I agree a self-signed cert should be fine.

        Except for the fact that all the browsers throw a fit when you run into a self-signed cert so they are no good for general purpose use.

        What we need is better handling in the browser of the split between identity and confidentiality, right now they are basically one and the same. Ideally ALL traffic would be encrypted and we would only get warnings about anomalies like unencrypted traffic and changed certs (the movement towards certificate pinning by default will help with that).

        • (Score: 0) by Anonymous Coward on Thursday November 20 2014, @01:09AM

          by Anonymous Coward on Thursday November 20 2014, @01:09AM (#117935)

          And with good reason. A self-signed certificate is only as good as a closed but unsealed letter: anyone who controls hardware between you and the target host can intercept it provided they don't care about the legal and moral implications.

          You don't want to mislead people into believing that the traffic is more secure than it is. Trust-free encryption only protects you against skript kiddies with questionable sensibility, not against targeted attacks from a rogue government.

      • (Score: 2) by Thexalon on Wednesday November 19 2014, @08:32PM

        by Thexalon (636) on Wednesday November 19 2014, @08:32PM (#117836)

        A self-signed certificate provides assurances of confidentiality but not identity (of the server).

        The reason the CA concept exists is the risk of a man-in-the-middle attack. I don't have to compromise DNS to do it either, if I have control of any router between the client and server.

        To use the classic model of Alice (client), Bob (server), and Eve (eavesdropper):
        1. Alice sends the initial request to Bob, with no encrypted information because Alice doesn't have Bob's public key yet.
        2. Eve intercepts Alice's request, and proxies the same request to Bob.
        3. Eve receives the reply from Bob, which contains Bob's public key, and keeps that key for herself.
        4. Eve then sends Alice her own public key, claiming it's Bob's.
        5. Alice happily replies to Eve, thinking she's talking to Bob. Eve decrypts with her own private key, then re-encrypts using Bob's actual public key to proxy the traffic on to Bob.
        This conversation then continues with Eve being able to read the traffic without any difficulties.

        The only way to prevent this is if Alice knows for a fact that what appears to be Bob's public key is in fact Bob's public key. Where the CAs are supposed to come in is that certificates (which wrap public keys) signed by a CA have been verified to actually come from the server they claim to be from.

        Of course, if you can fake the CA signing, then you can still do this attack and Alice and Bob will be none the wiser.

        --
        The only thing that stops a bad guy with a compiler is a good guy with a compiler.
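
        The missing step -- Alice needing an out-of-band reason to trust that the key is Bob's -- can be sketched with `openssl verify`: a self-signed cert fails validation until you explicitly supply it as a trust anchor (file names and CN are invented for the example):

        ```shell
        # A throwaway self-signed cert standing in for Bob's.
        openssl req -x509 -newkey rsa:2048 -keyout bob.key -out bob.pem \
          -days 1 -nodes -subj "/CN=bob.test"

        # With no trust anchor, verification fails (non-zero exit status)...
        openssl verify bob.pem || echo "untrusted, as expected"

        # ...but succeeds once Alice explicitly trusts that exact cert.
        openssl verify -CAfile bob.pem bob.pem
        ```

        Swapping `-CAfile` for a CA's root cert is exactly the substitution the CA system performs: trust in one anchor instead of one key per server.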
        • (Score: 0) by Anonymous Coward on Wednesday November 19 2014, @09:28PM

          by Anonymous Coward on Wednesday November 19 2014, @09:28PM (#117855)

          Like he said, a self signed cert means that you are creating an encrypted tunnel, just that you don't actually know WHO is at the end of that tunnel.

          Doing so is no less secure than simply using http (and ever so slightly more so), and thus FF should simply not show that the connection is secure (as Chrome does) rather than throw the hissy fit it does today.

        • (Score: 2) by FatPhil on Thursday November 20 2014, @10:51AM

          by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Thursday November 20 2014, @10:51AM (#118061) Homepage
          You don't need to "fake" the CA signing, just get Honest Achmed's CA to sign it for you.
          (https://bugzilla.mozilla.org/show_bug.cgi?id=647959 , which is even more amusing since 2013)
          --
          Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
      • (Score: 2) by TheLink on Thursday November 20 2014, @05:56PM

        by TheLink (332) on Thursday November 20 2014, @05:56PM (#118182) Journal

        A self-signed certificate provides assurances of confidentiality but not identity (of the server).

        In the ivory tower academic theory world you are right.
        In the real world, self-signed certs can actually be more secure than the existing system where your security depends on the weakest crappiest or most untrustworthy CA among the hundreds of CAs that your browser trusts (either directly or chained):
        https://blog.mozilla.org/security/2013/01/03/revoking-trust-in-two-turktrust-certficates/ [mozilla.org]
        https://www.schneier.com/blog/archives/2011/03/comodo_group_is.html [schneier.com]
        http://www.bbc.com/news/technology-14819257 [bbc.com]

        In the real world with self-signed certs you'd be pretty certain it was the same entity it was for the past visits (which could either be the actual server or the same entity doing an MITM on you ;) ).

        That is more than you can say for the existing browser CA system. Because with the exception of special cases like pinned certificates most browsers won't warn you if the cert or even the CA has been changed, as long as the cert is signed by a CA recognized by the browser. And so you can get MITMed without any warning.

        To give you a real world example: say you're in the USA using your corporate, US-employee-only HTTPS website to check your top secret corporate webmail. Then you go on a business trip to China and check your webmail using your laptop.

        With the existing CA system and browsers as they are (without 3rd party add ons) the Chinese Government can MITM your corporate webmail by just using a suitable cert signed by CNNIC or similar. None of the browsers by default would warn you- the cert hostname matches your corporate webmail hostname, the CA is CNNIC which is accepted by all popular browsers. It's not just China - other governments or even organizations can MITM you in a similar way.

        In contrast if there wasn't this CA system and you only had self-signed certs you've been using without any problems in the USA, the Chinese Government would not be able to MITM you without your browser warning you.

        It's like the ssh server key warning you get on the first time visit. If you're connecting to the server for the first time, yes there is a chance that someone could happen to be doing an MITM on you at that very moment, so if you're paranoid you should check some other way. But in practice it's a very small chance and not that likely you'd be MITM just at that moment especially if it's on a network you can trust. Then when you go to somewhere else and get a warning, you know something is wrong.

        Now that's why some people use 3rd party plugins like Certificate Patrol. The problem with Certificate Patrol is it doesn't support more than one cert per host. This is fine for most corporate setups but not so much for gmail and other services where multiple different certs are used.

        By the way, if you're using IE or Chrome (on windows) be aware that CA certificates can get auto-added if Microsoft or some other CA vouches for them even if they don't appear at first or are deleted. Don't believe me? Go visit an https website of some CA. Delete that CA's cert from the list of CAs. Go visit the website again. Notice that there are no warnings and notice that the CA's CA cert appears again. Basically if you don't trust certain CAs you have to get ALL of their certs in your repository and then disable ALL of them. Good luck with that. I'm not sure if your OS will work perfectly if you disable Microsoft's CA cert (and if I recall correctly Microsoft's cert is all you need enabled for some other CA's to be added).

        I think it's similar for Firefox too but at least they have a bigger existing list that you can go through to disable.

        See also: http://netsekure.org/2010/04/how-to-disable-trusted-root-certificates/ [netsekure.org]
        https://bluebox.com/technical/questioning-the-chain-of-trust-investigations-into-the-root-certificates-on-mobile-devices/ [bluebox.com]

        Lastly, the great thing about this Free CA would be that time and money would be saved - people will get fewer useless warnings that half of them would click through and not remember anyway. Who really cares or knows about security anyway? Certainly not the people making the browsers, nor most of the people using them.
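
        The ssh-style, trust-on-first-use model described above needs nothing more than a stored fingerprint; a minimal sketch using openssl (the cert and file names are invented for the example):

        ```shell
        # A throwaway cert standing in for the server's.
        openssl req -x509 -newkey rsa:2048 -keyout s.key -out s.pem \
          -days 30 -nodes -subj "/CN=webmail.corp.test"

        # First visit: record the SHA-256 fingerprint (trust on first use).
        openssl x509 -in s.pem -noout -fingerprint -sha256 > pinned.txt

        # Later visits: warn only if the fingerprint changed.
        if openssl x509 -in s.pem -noout -fingerprint -sha256 \
             | diff -q - pinned.txt >/dev/null
        then echo "fingerprint unchanged"
        else echo "WARNING: certificate changed"
        fi
        ```

        This is the check the default browser CA model skips: as long as *some* trusted CA signed the new cert, a silent swap raises no warning.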

    • (Score: 0) by Anonymous Coward on Wednesday November 19 2014, @08:11PM

      by Anonymous Coward on Wednesday November 19 2014, @08:11PM (#117827)

      Mod parent up. Wikis and open source are a great technology for many things; I don't think acting as a CA is one of them.

    • (Score: 3, Informative) by morgauxo on Wednesday November 19 2014, @09:03PM

      by morgauxo (2082) on Wednesday November 19 2014, @09:03PM (#117846)

      Something I am really tired of is how browsers handle self-signed certificates. I've tried to use them for my own personal web applications which were only meant to be used (& trusted) by ME. They always give you that popup warning you about the potential danger. (good) They usually have a button to click to trust the certificate anyway and a checkbox to trust it permanently. (good, I should be able to trust MY OWN certificate!).

      Then the F#$%G Browser goes and pops the same @#%$# message at me again every time even though I click the box to always trust this certificate! WTF! Apparently I HAVE to pay someone to use SSL without being constantly nagged. Why do they even have those checkboxes if they are going to consistently ignore them?

      I've seen this behavior on all of the major browsers.

      • (Score: 2) by Leebert on Wednesday November 19 2014, @10:13PM

        by Leebert (3511) on Wednesday November 19 2014, @10:13PM (#117875)

        Haven't seen that behavior. (Actually, thinking of it, I *might* have seen it in IE; can't recall for sure)

        But have you considered just generating your own CA and importing it into the appropriate certificate store(s)? Might be simpler in the long run.
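
        Generating your own CA is a few openssl commands; a rough sketch (all names are placeholders, and `www.testserver.here` is just an example hostname):

        ```shell
        # Create a throwaway root CA.
        openssl req -x509 -newkey rsa:2048 -keyout ca.key -out ca.pem \
          -days 30 -nodes -subj "/CN=My Local CA"

        # Create a server key + CSR, then sign the CSR with the CA.
        openssl req -newkey rsa:2048 -keyout srv.key -out srv.csr \
          -nodes -subj "/CN=www.testserver.here"
        openssl x509 -req -in srv.csr -CA ca.pem -CAkey ca.key \
          -CAcreateserial -out srv.pem -days 30

        # Once ca.pem is imported into the browser's trust store,
        # the chain validates cleanly:
        openssl verify -CAfile ca.pem srv.pem
        ```

        After importing `ca.pem` once, every cert you sign with it is trusted, with no per-cert warnings.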

        • (Score: 2) by ancientt on Thursday November 20 2014, @12:56AM

          by ancientt (40) <ancientt@yahoo.com> on Thursday November 20 2014, @12:56AM (#117930) Homepage Journal
          Good point. We have a couple vendors that have used or are using self signed certs. I set up our LAN so that workstations recognize the ones they're normally supposed to interact with so that they don't get warnings. Every time a regular user gets a cert warning, IT should know that something is likely amiss since what should be trusted on our network will be trusted. For the ones that average workers shouldn't be interacting with, but I do, I put the add-on in Chrome (my usual browser for those types of sites) so that I can bypass the check for certs I decided to trust. It was pretty simple to work with so you might consider that if a network or computer modification to trust a root isn't the solution you need.
          --
          This post brought to you by Database Barbie
      • (Score: 2) by bryan on Thursday November 20 2014, @12:54AM

        by bryan (29) <bryan@pipedot.org> on Thursday November 20 2014, @12:54AM (#117928) Homepage Journal

        Or you could install your self-created root CA certificate to your trusted authority list (Firefox example [cyberciti.biz]) on the computers you use before going to your HTTPS site.

        Then the process works as intended without any nasty warning messages.

      • (Score: 0) by Anonymous Coward on Thursday November 20 2014, @10:58AM

        by Anonymous Coward on Thursday November 20 2014, @10:58AM (#118063)
        Did you make sure the hostname on your URL matches the CN on your cert?

        If you didn't I'm not surprised if browsers keep giving you warnings.
        e.g. your cert is www.testserver.here but your url is https://10.1.1.2/

        One workaround is to edit your hosts file so that www.testserver.here = 10.1.1.2 and then use https://www.testserver.here/
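
        Another option, if your OpenSSL is 1.1.1 or newer, is to put both the hostname and the IP in the cert's subjectAltName so the browser's name check passes either way (the hostname and IP here are the example values from the comment above):

        ```shell
        # -addext requires OpenSSL 1.1.1+.
        openssl req -x509 -newkey rsa:2048 -keyout k.pem -out c.pem \
          -days 30 -nodes -subj "/CN=www.testserver.here" \
          -addext "subjectAltName=DNS:www.testserver.here,IP:10.1.1.2"

        # Confirm both names made it into the cert:
        openssl x509 -in c.pem -noout -ext subjectAltName
        ```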
    • (Score: 2) by hemocyanin on Wednesday November 19 2014, @11:19PM

      by hemocyanin (186) on Wednesday November 19 2014, @11:19PM (#117894) Journal

      Of course you can. But then there is the scare warning and multiple steps a user must go through to accept the cert and access your site.

      I think what the EFF is doing here is awesome -- there are probably millions of low volume websites that aren't about making any money, even though the cost for a basic SSL cert can be insubstantial. Random googling got me this: https://www.ssls.com/comodo-ssl-certificates/positivessl.html?years=5 [ssls.com] .... of course, that's with a 2048 bit key rather than the 4096 that's becoming popular. And while $5/yr sounds nice, you have to buy five years at once, which probably isn't going to bankrupt anyone, but it is enough to make someone think "forget it". But for free, and without scaring off users, there is no excuse. That's why I say this is a great thing EFF is doing.

      Of course, if your website demands the ultimate trust between you and users, then a self-signed cert is the way to go, because you probably aren't feeding info to the NSA like some el-cheapo (or expensive) provider may be. I'm guessing the EFF would also not share with the NSA, but you really can't trust a third party in America due to the Third Party Doctrine exception to the 4th Amendment. But for your site devoted to $randomHobby, these EFF certs will be perfect.

  • (Score: 1, Interesting) by Anonymous Coward on Wednesday November 19 2014, @07:26PM

    by Anonymous Coward on Wednesday November 19 2014, @07:26PM (#117811)

    Is Google on board with this? Thanks to Chrome, they're the only browser vendor who matters. They will make or break this project.

    Mozilla is irrelevant now. Firefox is at about 10% of the combined desktop and mobile market, and this number is dropping. So Mozilla has basically no influence any longer.

    • (Score: 2, Informative) by Ethanol-fueled on Wednesday November 19 2014, @10:05PM

      by Ethanol-fueled (2792) on Wednesday November 19 2014, @10:05PM (#117870) Homepage

      Google, get out. Your browser sucks ass and has always sucked ass.

      • (Score: 0) by Anonymous Coward on Wednesday November 19 2014, @11:12PM

        by Anonymous Coward on Wednesday November 19 2014, @11:12PM (#117891)

        Different AC here. How is what you say relevant? Chrome being a shitty browser (which it is, obviously) is orthogonal to the fact that it's still the most widely used browser these days by a huge margin. It's also orthogonal to the fact that very few people use Firefox these days, and these few remaining users are the only way Mozilla can exert any sort of influence at all.

        Unless Chrome accepts these certs, they'll be fucking useless.

        • (Score: 2) by Bot on Wednesday November 19 2014, @11:28PM

          by Bot (3902) on Wednesday November 19 2014, @11:28PM (#117900) Journal

          The problem is that if site X works with IE, Firefox, Safari, and not Chrome, people will perceive a problem with Chrome.

          --
          Account abandoned.
          • (Score: 0) by Anonymous Coward on Thursday November 20 2014, @12:15AM

            by Anonymous Coward on Thursday November 20 2014, @12:15AM (#117916)

            That's not how it works when over 60% of web users are using Chrome, 20% are using IE, 10% are using Firefox, and the rest are using Opera and other totally irrelevant browsers. At least 60% of web users will see any web site using one of these certs as broken. Might makes right.

            Of course, it didn't have to be this way. Mozilla didn't have to drive away most of Firefox's users due to really fucking idiotic changes over the course of many years. Mozilla would still be relevant if they had 30% or more of the browser market. But their current 10% means they're nothing.

      • (Score: 0) by Anonymous Coward on Thursday November 20 2014, @12:09AM

        by Anonymous Coward on Thursday November 20 2014, @12:09AM (#117914)

        Could be worse. Could be firefox.

        • (Score: 0) by Anonymous Coward on Thursday November 20 2014, @12:29AM

          by Anonymous Coward on Thursday November 20 2014, @12:29AM (#117920)

          LOL! Everyone hates Firefox.

      • (Score: 3, Interesting) by mojo chan on Thursday November 20 2014, @12:34PM

        by mojo chan (266) on Thursday November 20 2014, @12:34PM (#118081)

        Insightful? You don't even say why Chrome sucks ass, whatever that means.

        Chrome is fast, reliable, and Google doesn't randomly break the UI and a few extensions every six months. Most of all it's secure, because it has an excellent security sandbox model (see pwn2own results, or yearly browser bug stats). Maybe you don't like Google, but you can always use Chromium or another build.

        So, what do you mean by "sucks ass" in this context? You prefer being screwed by Mozilla instead? You like ads on your new tab page? Insecure and brittle plug-ins? Sounds awesome.

        --
        const int one = 65536; (Silvermoon, Texture.cs)
        • (Score: 0) by Anonymous Coward on Thursday November 20 2014, @12:46PM

          by Anonymous Coward on Thursday November 20 2014, @12:46PM (#118086)

          Chrome's UI is total shit. There's always the risk of it making contact with Google behind-the-scenes. Its "excellent security sandbox model" means that extensions can't do jack shit, making most of them crippled or useless.

          And how the fuck do the problems you list with Firefox somehow make Chrome a better option? They don't! They just show that Firefox and Chrome are both total crap!

    • (Score: 0) by Anonymous Coward on Thursday November 20 2014, @01:28AM

      by Anonymous Coward on Thursday November 20 2014, @01:28AM (#117938)

      BS, Mozilla is at 20%.

      • (Score: 0) by Anonymous Coward on Thursday November 20 2014, @02:42AM

        by Anonymous Coward on Thursday November 20 2014, @02:42AM (#117958)

        If by 20% you mean 10%, then, yes, you are correct.

    • (Score: 2) by mtrycz on Thursday November 20 2014, @11:35AM

      by mtrycz (60) on Thursday November 20 2014, @11:35AM (#118069)

      And Google != Chrome.

      Google is evil, and while they have the resources to build a world class browser (that tracks your every single step) they also do a lot of other things. Their mission is money, just that.

      Mozilla is not purely evil yet, and while they have a good* browser (that coincidentally doesn't track you) it's just a part of a large initiative for the whole of Web. Their mission is a more open web, and while I might not share all of it, it's still great to have Mozilla around. The internet would not be the same without Mozilla, moving those steps back then, and doing so now.

      * Firefox is a good browser; if you disagree, state exactly what's wrong or gtfo. It has quirks and I don't share the UI choices, but I'll gladly trade the quirks for keeping my soul to myself.
      Also, in no way is what Firefox is now relevant to other Mozilla initiatives, which are quite fine if you ask me.

      --
      In capitalist America, ads view YOU!
  • (Score: 3, Funny) by BananaPhone on Wednesday November 19 2014, @07:36PM

    by BananaPhone (2488) on Wednesday November 19 2014, @07:36PM (#117816)

    "we have put together with Mozilla, Cisco, Akamai, IdenTrust, and researchers at the University of Michigan"

    It's based in land of the watched and home of the slaves.

    It's one NSL away from being compromised.

    • (Score: 2) by GungnirSniper on Wednesday November 19 2014, @07:39PM

      by GungnirSniper (1671) on Wednesday November 19 2014, @07:39PM (#117817) Journal

      That was my thought as well, but where else could it be based that isn't going to be compromised either by the government there or by US government pressure on that government?

    • (Score: 2) by tempest on Wednesday November 19 2014, @08:01PM

      by tempest (3050) on Wednesday November 19 2014, @08:01PM (#117825)

      Compromising this wouldn't gain them anything, except maybe more information for their meta database. Compromising a CA only allows them to do a man in the middle attack, which they can already do.

    • (Score: 0) by Anonymous Coward on Wednesday November 19 2014, @08:38PM

      by Anonymous Coward on Wednesday November 19 2014, @08:38PM (#117838)

      > It's one NSL away from being compromised.

      Don't be one of those guys who think that if it isn't perfect it isn't worth doing.

      Yes it is vulnerable, but the goal is to make mass surveillance too expensive. Combined with PFS (perfect forward secrecy) it doesn't matter if the certs are compromised, they can't retroactively decrypt all traffic, they can only man-in-the-middle specific targets and it seems infeasible to try to MITM everybody because that would be orders of magnitude more work than simple passive recording of all traffic.
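
    The PFS point is checkable locally: cipher suites keyed by an ephemeral (EC)DH exchange never transmit anything a later key compromise can unlock, and OpenSSL will list which suites those are:

    ```shell
    # Suites using ephemeral ECDH key exchange provide forward secrecy:
    # a leaked certificate key cannot decrypt previously recorded traffic.
    openssl ciphers 'ECDHE' | tr ':' '\n' | head -5
    ```

    A server restricted to these suites forces an adversary into the expensive per-connection MITM described above rather than passive recording.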

    • (Score: 0) by Anonymous Coward on Wednesday November 19 2014, @09:28PM

      by Anonymous Coward on Wednesday November 19 2014, @09:28PM (#117856)

      Agreed. They should base it in Russia or China where it will be safe from being compromised.

      (That's sarcasm, by the way)

  • (Score: 1) by NotSanguine on Wednesday November 19 2014, @08:30PM

    by NotSanguine (285) <reversethis-{grO ... a} {eniugnaStoN}> on Wednesday November 19 2014, @08:30PM (#117835) Homepage Journal

    It seems to me that more encryption (I'm talking confidentiality here, not identity) is better.

    Obviously, for certain network connections (email providers, bank sites, shopping and other commercial sites) identity is also important.

    However, given that data traversing the nets is being sucked up into the vast US "intelligence" machinery, as well as by other state and non-state actors, encrypting network traffic (NB, while web traffic is what people think about most, it's not the only sort of network traffic) is a win even if you can't verify the real identity of a site you visit.

    Saying that encryption is worthless without identity is making the perfect the enemy of the good, IMHO.

    Given that the biggest threat to *encrypted* communications is the MITM/proxy attack (DNS hijacking is also an issue, but one whose incidence is orders of magnitude smaller), perhaps a mechanism simpler than x509 certs could help.

    IPSec is supported natively in IPv6, and implementations for IPv4 are available for just about any platform you'd care to name. A protocol that utilized the IP address/port combination (in addition to other characteristics) in generating encryption keys for IPSec connections, with an additional path check to (at least attempt to) verify that the site which claims to host a specific IP address is on the network it claims to be, could provide widespread encryption with minimal impact.

    Yes, this is easily broken by proxy servers or a determined MITM attack. However, it would be far more difficult to do so *transparently* -- giving the user more knowledge about who or what is inspecting their traffic.

    No. Such a protocol would not fix everything, nor would it give us any guarantee that our communications weren't being intercepted (via proxies, legal/hacking activity on the endpoints, DNS hijacking, etc.), but it could dramatically decrease the amount of unencrypted traffic traversing the 'net.

    Obviously, wide-scale adoption of any such protocol would be the biggest stumbling block for such a protocol. However, it's clear that things need to change.

    Ubiquitous network encryption is a worthy goal, with or without identity verification.

    Perhaps my suggestion isn't the best one (it certainly has plenty of edge cases which would limit its efficacy significantly), or even a particularly good one, but it would be better than what we have now. x509 certs/SSL keys are clearly not the best solution. Rather than throw up our hands, perhaps adding additional layers of potential encryption could help to thwart those who wish to spy on us.

    --
    No, no, you're not thinking; you're just being logical. --Niels Bohr
    • (Score: 3, Insightful) by snick on Wednesday November 19 2014, @08:58PM

      by snick (1408) on Wednesday November 19 2014, @08:58PM (#117845)

      Ubiquitous network encryption is a worthy goal, with or without identity verification.

      huh?

      Why bother with encryption if you have no idea whether the other end of the encrypted pipe is your friend or a MITM?

      That would be about as effective as just using ROT13.

      Accept encryption w/o verification and your own ISP will MITM you on the first hop so they can shape your traffic.

      • (Score: 2) by tibman on Wednesday November 19 2014, @09:53PM

        by tibman (134) Subscriber Badge on Wednesday November 19 2014, @09:53PM (#117864)

        So self-signed certs are equally useless then? I think not. Verification doesn't require a CA, though that is certainly the easiest way.

        --
        SN won't survive on lurkers alone. Write comments.
        • (Score: 2) by snick on Thursday November 20 2014, @01:07AM

          by snick (1408) on Thursday November 20 2014, @01:07AM (#117934)

          Self signed certs are useless *unless* you manage the trust yourself, and then they are more secure than public CA certificates, but managing that trust is a lot of work.

      • (Score: 1) by NotSanguine on Wednesday November 19 2014, @10:06PM

        by NotSanguine (285) <reversethis-{grO ... a} {eniugnaStoN}> on Wednesday November 19 2014, @10:06PM (#117871) Homepage Journal

        Why bother with encryption if you have no idea whether the other end of the encrypted pipe is your friend or a MITM?

        So that it's non-trivial for those who are sucking up huge volumes of data at IX nodes to analyze the contents of your network connections, maybe?

        There are many situations where sensitive information is *not* being transmitted (e.g., browsing on SN). In such cases, identity (as distinct from encryption) isn't really important -- aside from corporate proxy servers and, perhaps, ISPs MITM'ing all network *data* (as opposed to reading/rewriting packet headers) traversing their network (I don't have any statistics on this, do you?). That seems really unlikely, however.

        Accept encryption w/o verification and your own ISP will MITM you on the first hop so they can shape your traffic.

Yes, that is absolutely an issue (in fact, I even mentioned it in my post -- reading what you reply to can be *so* informative, if something of a chore). However, there are mechanisms which can, in theory, at least let you know when that's going on (a path check a la traceroute, perhaps). Although I suppose that your ISP could rewrite *all* DNS queries (and my suggestion, as I stated clearly, does nothing about DNS hijacking) to respond with the IP addresses of their own systems, that would be detected very quickly -- and users would have a huge fit. Also, users would actually be *aware* that this is going on, rather than the ISPs doing it with unencrypted traffic while the end-user remains completely oblivious. In a corporate environment, that's much more problematic. As I said (since you missed it the first time, apparently):

        Yes, this is easily broken by proxy servers or a determined MITM attack. However, it would be far more difficult to do so *transparently* -- giving the user more knowledge about who or what is inspecting their traffic.

        Also, I pointed out that:

Perhaps my suggestion isn't the best one (it certainly has plenty of edge cases which would limit its efficacy significantly), or even a particularly good one, but it would be better than what we have now. x509 certs/SSL keys are clearly not the best solution. Rather than throw up our hands, perhaps adding additional layers of potential encryption could help to thwart those who wish to spy on us.

So. You believe that my suggestion isn't a good one. Do you have any better ideas? If so, let's hear them. I haven't thought through many of the implications and challenges of such an implementation. In fact, it was a passing thought that I decided to share in the hope of generating a technical discussion about encryption that doesn't rely on centralized authorities.

        An inauspicious start to the discussion. Hopefully others will have more constructive things to add.

        --
        No, no, you're not thinking; you're just being logical. --Niels Bohr
        • (Score: 3, Insightful) by snick on Thursday November 20 2014, @12:57AM

          by snick (1408) on Thursday November 20 2014, @12:57AM (#117931)

          My point wasn't that accepting encryption w/o verification would make MITM possible. It was that it would make MITM _universal_.

ISPs would routinely decrypt/analyse/re-encrypt all traffic. They would transmit your traffic internally (and between themselves and "trusted" partners) in the clear and only do SSL at the edges.

          This will be the practice for _all_ SSL traffic. (not just for connections being attacked by malicious third parties)

          This would be much worse than no SSL because you think that your pipe is encrypted, but it will only be encrypted for the first and last hop. Between those, all your traffic will be on parade.

          • (Score: 1) by NotSanguine on Thursday November 20 2014, @02:31AM

            by NotSanguine (285) <reversethis-{grO ... a} {eniugnaStoN}> on Thursday November 20 2014, @02:31AM (#117952) Homepage Journal

            My point wasn't that accepting encryption w/o verification would make MITM possible. It was that it would make MITM _universal_.

            Please give me an example of an ISP that routinely uses transparent proxies to intercept encrypted communications and then uses that data for the purposes you suggest.

What is more, when encryption is done at the network layer (a la IPSec), the volumes of traffic involved (how do you cache VOIP traffic or a live videoconference on a caching proxy?) make such activity extremely costly -- possibly prohibitively so.

            This will be the practice for _all_ SSL traffic. (not just for connections being attacked by malicious third parties)

            This would be much worse than no SSL because you think that your pipe is encrypted, but it will only be encrypted for the first and last hop. Between those, all your traffic will be on parade.

Right. Because all ISPs already do this. Not. Besides, I'm not talking about SSL/TLS; its system resource overhead is too high to use for ubiquitous encryption of *all* traffic -- even DNS traffic, which, once encrypted, would make rewriting DNS responses more difficult and thwart attempts by ISPs (or others) to redirect traffic. And, like I said, as long as you set this up correctly, attempts by intermediate actors to proxy the encryption would be immediately obvious.

            All that said, what do *you* think is a good solution for ubiquitous encryption across the Internet?

            --
            No, no, you're not thinking; you're just being logical. --Niels Bohr
    • (Score: 2) by frojack on Wednesday November 19 2014, @11:41PM

      by frojack (1554) on Wednesday November 19 2014, @11:41PM (#117906) Journal

IPSec is supported natively in IPv6, and implementations for IPv4 are available for just about any platform you'd care to name. If a protocol were to utilize the IP address/port combination (in addition to other characteristics) in generating encryption keys for IPSec connections, with an additional path check to (at least attempt to) verify that the site which claims to host a specific IP address is on the network it claims to be, it could provide widespread encryption with minimal impact.

You seem to be looking to the transport layer for protection. The classic model suggests that's not the right place, but for the purposes you've outlined, it appears that it might work "good enough" as a starter approach. (Yes, it's easy to change your MAC and therefore your IPv6/IPv4 address, but hey, it's better than nothing.)

      I like the additional path check idea, but we sort of already have that in the PGP encrypted email arena. All I need to send you encrypted email is your email address, and your public key which I can fetch from keyservers.

Why can't something similar be used for web servers? I encrypt my web request with SoylentNews's widely published public key, and they send me web pages encrypted with my widely published public key. (Note: keyservers currently work to provide a public key for an email address, but that is not the only way they can work. They can be set up to provide a public key for any random string, such as soylent+newsisbetterthan-that-green-site.)

      You may think it would be easy to corrupt the whole keyserver system, but, because so much would depend on it, you couldn't do it surreptitiously. Everybody would know it happened.
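The lookup half of this already exists: HKP keyservers answer searches on an arbitrary string, so nothing in the protocol itself restricts keys to email addresses. A small Python sketch of building such a query (the keyserver name and search string are made-up examples, not real entries):

```python
import urllib.parse

# HKP keyservers answer GET /pks/lookup requests; "search" is just a string,
# so a key could be published under "soylentnews.org:web" as easily as under
# an email address. Server name and search string are placeholders.
def hkp_lookup_url(keyserver: str, search: str) -> str:
    query = urllib.parse.urlencode(
        {"op": "get", "options": "mr", "search": search}
    )
    return "https://" + keyserver + "/pks/lookup?" + query

print(hkp_lookup_url("keys.example.org", "soylentnews.org:web"))
```

The hard part, as with email, isn't lookup -- it's deciding whether the key you got back really belongs to the site you think it does.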

      --
      No, you are mistaken. I've always had this sig.
      • (Score: 1) by NotSanguine on Thursday November 20 2014, @02:20AM

        by NotSanguine (285) <reversethis-{grO ... a} {eniugnaStoN}> on Thursday November 20 2014, @02:20AM (#117950) Homepage Journal

You seem to be looking to the transport layer for protection. The classic model suggests that's not the right place, but for the purposes you've outlined, it appears that it might work "good enough" as a starter approach. (Yes, it's easy to change your MAC and therefore your IPv6/IPv4 address, but hey, it's better than nothing.)

I'm not trying to be glib, but please tell me which "classic" model (with references, please) you're referring to -- I'm not sure I know what you're talking about.

In any case, performing encryption at the application level is mostly the result of a fragmented approach to authentication/authorization/confidentiality, as in SSH or SSL/TLS, since IPv4 had (until IPSec) no standard mechanisms for these tasks. As such, a number of applications implemented their own tools, which are neither standardized nor particularly efficient.

        Having strong encryption at the network layer is faster, smarter and much more versatile than a bunch of different application-level encryption tools/mechanisms. Which is, of course, one of the reasons why IPSec is native to IPv6.

In my experience, IPSec transport-mode connections are cheaper (resource-wise) to set up and tear down, and the encryption is transparent to applications. Additionally, especially with IPv6, no additional software is required on any of the peers. What is more, if you want ubiquitous encryption (confidentiality, not identity or non-repudiation), why should that be done at the application layer?
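For concreteness, a host-to-host transport-mode policy is only a few lines in a strongSwan-style ipsec.conf. The addresses and the pre-shared-key choice here are illustrative assumptions, not a tested configuration:

```
# /etc/ipsec.conf -- illustrative sketch only
conn host-to-host
    type=transport        # encrypt payloads between two hosts, no tunnel
    left=192.0.2.10       # this host (placeholder address)
    right=198.51.100.20   # peer host (placeholder address)
    authby=secret         # pre-shared key kept in /etc/ipsec.secrets
    auto=start
```

No application on either host needs to know the encryption is there, which is the transparency being claimed above.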

The potential for spoofing is certainly a big issue, but if you're trying to make encryption ubiquitous as a mechanism for confidentiality (think speaking ancient Greek to your friend while walking down the street -- does it guarantee that you won't be overheard and understood? No. But the vast majority would have no idea what you were discussing), it would seriously hamper those who want to scoop up every bit of information about us (assuming we don't just give it away through social media) through whatever means are at their disposal.

        What is more, spoofing is an issue for many (most?) internet resources as they don't use any identity validation *or* encryption at all.

        Yes. It's better than nothing. It could be a *lot* better than nothing if we can identify a mechanism (PGP keys, as you noted, might be an avenue to explore, although, off the top of my head, there are a few issues with that idea) to reliably identify (not encrypt) endpoints.

        --
        No, no, you're not thinking; you're just being logical. --Niels Bohr
  • (Score: 2) by fnj on Wednesday November 19 2014, @08:45PM

    by fnj (1654) on Wednesday November 19 2014, @08:45PM (#117842)

    Why does it take six months of huffing and puffing to get this started? Why not start the service NOW?

    • (Score: 1, Interesting) by Anonymous Coward on Wednesday November 19 2014, @09:36PM

      by Anonymous Coward on Wednesday November 19 2014, @09:36PM (#117862)

      Why does it take six months of huffing and puffing to get this started? Why not start the service NOW?

Most likely the CA certificate has to be trusted by browsers other than just Mozilla's.

  • (Score: 2) by KritonK on Thursday November 20 2014, @09:59AM

    by KritonK (465) on Thursday November 20 2014, @09:59AM (#118056)

    we need to move to a future where every website is HTTPS by default

    The way HTTPS works, it is not possible to use it correctly with virtual hosts [wikipedia.org]. The host name is transmitted encrypted, so the web server cannot know what that name is before it decrypts it. Thus, the web server cannot use a different SSL certificate for each of its virtual hosts, and has to limit itself to using only one, with which all HTTPS communication is decrypted. The virtual host, for which this one certificate was issued, will work fine with HTTPS, but you'll get incorrect certificate warnings for all the others, if you try accessing them via HTTPS.

    • (Score: 2) by TheLink on Thursday November 20 2014, @11:00AM

      by TheLink (332) on Thursday November 20 2014, @11:00AM (#118064) Journal

      Shouldn't be that long before it's viable: https://en.wikipedia.org/wiki/Server_Name_Indication#Support [wikipedia.org]
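On the client side, SNI is a one-argument affair in most modern TLS stacks. A Python sketch (the host name is just an illustrative placeholder; the actual network call is left commented out):

```python
import socket
import ssl

# SNI puts the requested host name, in the clear, in the TLS ClientHello,
# so a server hosting many names can pick the matching certificate before
# the encrypted session is established.
print("This Python build supports SNI:", ssl.HAS_SNI)

def fetch_peer_cert(host: str, port: int = 443) -> dict:
    """Connect with SNI and return the server's certificate details."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        # server_hostname is what triggers the SNI extension.
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()

# fetch_peer_cert("example.org")  # uncomment to try against a real host
```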

      • (Score: 2) by akinliat on Friday November 21 2014, @05:23PM

        by akinliat (1898) <akinliatNO@SPAMgmail.com> on Friday November 21 2014, @05:23PM (#118535)

That pretty much means "right now," then. The servers that support SNI include virtually all of the ones anyone would currently be setting up, and most of the ones currently operating (a statistic that I'm totally just making up, but I imagine it's true).

As for clients, it looks like it's supported by most browsers except IE6 and IE7 (surprise) -- oh, and anything running on Windows XP. So another nail in the XP coffin, I suppose.

What kills me is that I had to find out about this here. This is big news if you're hosting sites, because it finally kills the one-IP-address-per-SSL-certificate issue that makes hosting secure sites such a pain (IPv4 addresses are getting as rare as hen's teeth). But does Apache do a big news event? Are there stories all over the tech news concerning this?

        No.

        I've been running servers that support this for almost two years, and I'm just finding this out. Maybe I need to get out more.

        • (Score: 0) by Anonymous Coward on Saturday November 22 2014, @05:00PM

          by Anonymous Coward on Saturday November 22 2014, @05:00PM (#118814)

          I've been running servers that support this for almost two years, and I'm just finding this out. Maybe I need to get out more.

          Nah, that's why you read stuff like SoylentNews.

          Or just google for: https multiple hosts single IP

          ;)