
posted by cmn32480 on Wednesday March 14 2018, @02:16PM   Printer-friendly
from the simple-cyphers dept.

Ars Technica reports:

In July of 2017, the nonprofit certificate authority Let's Encrypt promised to deliver something that would put secure websites and Web applications within reach of any Internet user: free "wildcard" certificates to enable secure HTTP connections for entire domains. Today, Let's Encrypt took that promised service live, in addition to a new version of the Automated Certificate Management Environment (ACME) protocol, an interface that can be used by a variety of client software packages to automate verification of certificate requests.
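A wildcard certificate such as `*.example.com` only covers a single subdomain label. As a toy illustration of RFC 6125-style matching (not Let's Encrypt's or any browser's actual code), the rule can be sketched like this:

```python
def wildcard_matches(pattern: str, hostname: str) -> bool:
    """Check a TLS wildcard pattern against a hostname.

    Per RFC 6125, '*' may only stand in for the entire left-most
    label, so '*.example.com' covers 'www.example.com' but not
    'a.b.example.com' or the bare 'example.com'.
    """
    p_labels = pattern.lower().split(".")
    h_labels = hostname.lower().split(".")
    if len(p_labels) != len(h_labels):
        return False
    if p_labels[0] == "*":
        return p_labels[1:] == h_labels[1:]
    return p_labels == h_labels

assert wildcard_matches("*.example.com", "www.example.com")
assert not wildcard_matches("*.example.com", "a.b.example.com")
assert not wildcard_matches("*.example.com", "example.com")
```

This is why a site that needs both `example.com` and `*.example.com` still requests two names on the certificate.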

[...] Many hosting providers already support the registration of Let's Encrypt certificates to varying degrees. But Let's Encrypt's free certificate offering hasn't been snapped up by some larger hosting providers—such as GoDaddy—who also sell SSL certificates to their customers.


Original Submission

 
  • (Score: 2) by NotSanguine on Wednesday March 14 2018, @09:15PM (6 children)

    In more than one scenario I deal with, the certificates are installed on a closed engineering network without Internet access so any certificate deployment needs to be manual

    If it's not exposed to the internet, just use self-signed certs. That's not the use case that Let's Encrypt tries to address.

    Let's Encrypt's advantage is that browsers and other *clients* on the Internet will trust Let's Encrypt certs.

    --
    No, no, you're not thinking; you're just being logical. --Niels Bohr
  • (Score: 4, Informative) by zocalo on Wednesday March 14 2018, @09:58PM (5 children)

    by zocalo (302) on Wednesday March 14 2018, @09:58PM (#652625)
    You do realise that closed networks still tend to use many of the same kinds of clients that the Internet does, right? Especially things like web browsers, which are getting more and more likely to throw a warning or error when they encounter a self-signed cert, and are making it increasingly hard for the end user to just click past the issue (which is a good thing, to be clear). That means you have to figure out some means of telling all your various clients that they need to trust your self-signed certificates. All of that is change, and change requires a great deal of administrative overhead and process to put in place if it's not already "the done thing", because you don't just make changes to life-safety critical systems; it takes a whole load of testing and sign-off (often from multiple stakeholders) first, even if the actual change itself is fairly routine in IT terms.

    Again, it's not the cost, it's the effort required to effect the change and manage the process. The least effort comes from certificates that are already trusted by the clients (e.g. from a well known CA - typically someone like Thawte), have a multi-year lifespan, and require as little in the way of additional PKI infrastructure, custom scripts, or manual processes as possible.
    --
    UNIX? They're not even circumcised! Savages!
    • (Score: 2) by NotSanguine on Wednesday March 14 2018, @10:36PM

      Absolutely.

      I was focusing my reply (and incorrectly in your case -- my apologies) on the use case where *all* clients and servers were members of the same organization. In that use case, everyone (including the various client applications) should already trust the internal CA certificate.

      Once you bring in users whose devices/applications are not owned/managed by the same organization, that goes right out the window.

      For your use case, a widely trusted cert is likely not a bad idea.

      However, if resources on the servers are sensitive, encryption isn't the only concern even if external parties are involved. If you need to do client *authentication* as well as encryption, Let's Encrypt would be much inferior to an internal CA issuing server *and* client certs, despite the issues around distribution and maintenance of the client certs.

      There are ways around such a management issue of course (VNC/RDP access from internal hosts, etc.).

      --
      No, no, you're not thinking; you're just being logical. --Niels Bohr
    • (Score: 0) by Anonymous Coward on Thursday March 15 2018, @07:40AM

      by Anonymous Coward on Thursday March 15 2018, @07:40AM (#652818)
      If real security is required, it's better to opt out of the "depend on hundreds of CAs to never screw up" model and rely on self-signed certificates, or only on certs signed by your internal CA.
    • (Score: 2) by TheRaven on Thursday March 15 2018, @08:39AM (2 children)

      by TheRaven (270) on Thursday March 15 2018, @08:39AM (#652840) Journal
      There are really only two alternatives:

      Option 1: The network is only used by a fixed set of clients, so you can push out your signing cert to all of them easily.

      Option 2: The network is used by clients that move from other networks to it. In this case, the air gap doesn't really buy you any security, because malware can infect one of the clients from the public network and can spread to your private network. In this case, you may as well set up a DMZ to push Let's Encrypt certs into the network.
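      The DMZ relay in option 2 boils down to a staging copy: a DMZ host renews via ACME, and an inward-facing job picks the files up. A minimal sketch, with temp directories standing in for the hypothetical ACME live directory and drop point:

```python
import pathlib
import shutil
import tempfile

def stage_renewed_cert(src_dir: pathlib.Path, drop_dir: pathlib.Path) -> None:
    """Copy a freshly renewed cert and key into a drop directory.

    In the real setup src_dir would be something like an ACME client's
    live directory on the DMZ host, and drop_dir a location the inner
    network pulls from over a tightly scoped channel.
    """
    drop_dir.mkdir(parents=True, exist_ok=True)
    for name in ("fullchain.pem", "privkey.pem"):
        shutil.copy2(src_dir / name, drop_dir / name)

# Demo with dummy files in temp dirs (the real paths are deployment-specific)
src = pathlib.Path(tempfile.mkdtemp())
(src / "fullchain.pem").write_text("dummy cert")
(src / "privkey.pem").write_text("dummy key")
drop = pathlib.Path(tempfile.mkdtemp()) / "example.com"
stage_renewed_cert(src, drop)
```

      The security argument is about the channel, not the copy itself: the cert and chain are public, so only the private key transfer needs protecting.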

      --
      sudo mod me up
      • (Score: 0) by Anonymous Coward on Thursday March 15 2018, @09:04AM

        by Anonymous Coward on Thursday March 15 2018, @09:04AM (#652849)

        For your regular malware, you are right. However, there are other ways to prevent those (updates, not running as admin).

        A directed attack, on the other hand, is a lot easier if you have two way communication, and can attack one layer at a time, rather than needing to prepare everything ahead of time to sneak your attack in. It can be done - Stuxnet is an example - but your regular ADHD teen out to do some random damage is not going to take the time to find out the exact structure of the layers of security before even starting to write their attack code.

      • (Score: 2) by zocalo on Thursday March 15 2018, @11:11AM

        by zocalo (302) on Thursday March 15 2018, @11:11AM (#652891)
        I'd agree with option one on a general-purpose network; engineering networks like these, not so much. *Every* change, no matter how minor or how many times it's been done before, needs paperwork, risk assessments, and approvals. "I want to replace an expired TLS certificate" = one set. "I want to push a new CA to all clients" (some of which are proprietary tools and may not support a script-based install) = another set, and so on.

        Yes, a lot of the clients are fairly easy to push a new CA to, and others just need a current cert and don't actually validate the entire trust chain, so they're easy too (and could be self-signed). The real PITAs are the proprietary tools that are at least partly managed by the vendor, don't tend to make the process easy, *and* make people in management very twitchy because you're proposing a change to something they don't really understand but know is very expensive and very mission-critical. Again, it's all about the paperwork and potential risk, not the cost, effort, or level of expertise involved. The easiest and least frequent hoop to jump through is to standardise on a set of commercial certs from a widely recognised vendor that will work across every client and server and only require updating as infrequently as possible.
        --
        UNIX? They're not even circumcised! Savages!