
posted by CoolHand on Thursday April 30 2015, @11:12PM
from the we're-not-really-just-procrastinating-honest! dept.

The Register covers the difficulty of putting the SHA-1 crypto algorithm to bed:

The road towards phasing out the ageing SHA-1 crypto hash function is likely to be littered with potholes, security experts warn.

SHA-1 is a hashing (one-way) function that converts information into a shortened "message digest", from which it is impossible to recover the original information. This hashing technique is used in digital signatures, in verifying that the contents of software downloads have not been tampered with, and in many other cryptographic applications.
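To make the digest idea concrete, here is a minimal sketch using Python's standard hashlib module; the input bytes are an assumption chosen purely for the example.

import hashlib

data = b"contents of a software download"  # example input, made up

# The digest is fixed-length and cannot be reversed to recover the input.
sha1_digest = hashlib.sha1(data).hexdigest()      # 160-bit SHA-1 digest (being phased out)
sha256_digest = hashlib.sha256(data).hexdigest()  # 256-bit digest from the SHA-2 family

print("SHA-1:  ", sha1_digest)
print("SHA-256:", sha256_digest)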

The ageing SHA-1 protocol – published in 1995 – is no longer safe from collision attacks, a situation where two different blocks of input data throw up the same output hash. This is terminal for a hashing protocol, because it paves the way for hackers to offer manipulated content that carries the same hash value as pukka packets of data.

Certificate bodies and others are beginning to move on from SHA-1 to its replacement, SHA-2. Microsoft announced its intent to deprecate SHA-1 in November 2013. More recently, Google joined the push with a decision to make changes in the latest version of its browser, Chrome version 42, so that SHA-1 certificates are flagged up as potentially insecure.

Just updating to SHA-2 is not as simple as it might seem, because of compatibility issues with Android and Windows XP. More specifically, Android before 2.3 and XP before SP3 are incompatible with the change (a fuller compatibility matrix maintained by digital certificate firm GlobalSign can be found here).

 
  • (Score: 1, Interesting) by Anonymous Coward on Friday May 01 2015, @02:28PM (#177470)

    Is it impossible for a CA to sign a cert in more than one way?

    If so, can this be changed so that a site can have a cert signed with SHA-1, SHA-2 and SHA-3 (did that one get finalized yet?), and the browser can simply verify the most secure one it can handle? If none of the signatures are considered secure by the browser, then it can be treated as a non-secure site ('I understand the risks').
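    A rough sketch of that idea in Python (the algorithm names, the signature dictionary, and the verify_signature callback are hypothetical, not a real TLS or CA API): the client verifies only the strongest signature both sides understand, and treats the site as non-secure if nothing acceptable remains.

    PREFERENCE = ["sha3-256", "sha256", "sha1"]  # strongest first

    def pick_and_verify(signatures, supported, verify_signature):
        """signatures: dict mapping algorithm name -> signature bytes over the cert."""
        for algo in PREFERENCE:
            if algo in signatures and algo in supported:
                # Check only the most secure signature this client can handle.
                return verify_signature(algo, signatures[algo])
        return False  # nothing verifiable -> fall back to 'I understand the risks'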

  • (Score: 1, Interesting) by Anonymous Coward on Friday May 01 2015, @07:51PM (#177584)

    I've also wondered why we don't do this more often: just use multiple hashes simultaneously.

    I'd extend the above suggestion to not only check against the most secure hash, but to check _all_ hashes/signatures available (that the system is capable of checking). Tampering with the data in something (a certificate, file, whatever) and then manipulating it to create a collision in multiple disparate, and perhaps sufficiently different, hash algorithms must be significantly more difficult than doing so in any single algorithm.

    I think Gentoo already does this when downloading source code for compiling, in addition to checking file sizes.

    Heck, we may consider MD5 and SHA-1 insecure when used on their own today (dare I include CRC?), but if you use all of the hashes at the same time and test against every algorithm, how secure is that for detecting tampering compared to using a single newer, more complex hashing algorithm? Not to say we can't use the newer algorithms as well; I just think perhaps we can still use older algorithms in more intelligent ways to mitigate the risk of hash collisions in them.
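    As a hedged sketch of the "check every digest" idea (the file path and the expected-digest dictionary are made up for illustration), Python's hashlib can verify all published digests at once; a forged file would then have to collide in every listed algorithm simultaneously:

    import hashlib

    def verify_all(path, expected_digests):
        """expected_digests: e.g. {"md5": "...", "sha1": "...", "sha256": "..."}"""
        with open(path, "rb") as f:
            data = f.read()
        # Accept the file only if every published digest matches.
        return all(hashlib.new(algo, data).hexdigest() == expected
                   for algo, expected in expected_digests.items())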

    • (Score: 2) by Yog-Yogguth (1862) Subscriber Badge on Saturday May 02 2015, @02:04AM (#177725) Journal

      I like the idea as a concept on its own but it's deceptive and doesn't relate to the situation at hand as far as I can tell.

      On its own as a concept (i.e. ignoring the rest of this comment) it should work as far as I can tell, but I'm not a cryptographer. It would be somewhat similar to/reminiscent of double hashing [wikipedia.org], except at the level of whole hash algorithms. I would think (but don't know for sure) that independently salting each hash should take care of any cryptographic problems (maintaining the one-way-street nature of hashing). All of this requires more resources, both work (energy, time) and storage (so less space left over on the same hardware), but those could be non-issues.
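      A rough sketch of that "independently salted parallel hashes" idea in Python (the algorithm pair, salt length, and return format are assumptions for illustration, not a vetted construction):

      import hashlib, os

      def parallel_salted_digests(data: bytes):
          # Each algorithm gets its own random salt; both salts and both
          # digests would be stored and re-checked together.
          salt1, salt2 = os.urandom(16), os.urandom(16)
          d1 = hashlib.sha256(salt1 + data).hexdigest()
          d2 = hashlib.sha512(salt2 + data).hexdigest()
          return (salt1.hex(), d1), (salt2.hex(), d2)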

      It's not a completely backwards-compatible solution, since in most situations where one could implement such a thing retroactively (i.e. without changing equipment) one is likely to be able to instead upgrade to a stronger hash algorithm (which is as good as certain to be more efficient than multiple parallel hashes).

      It could make the problems worse, since instead of upgrading or changing some equipment you now have to change all the equipment so it understands the new approach (since no equipment currently has it).

      There might not be any use cases left at that point (I don't know, but suspect there aren't).

      Would it make sense instead as future-proofing against weaknesses in specific algorithms? Again, I kind of think it would, but I don't have the confidence to say for sure, or to say whether the utility is worth the effort (I strongly suspect it /isn't/ worth it; there will always be some pain involved with old/“obsolete” systems). If it is worth it, then it's worth it as an open-ended, gradually introduced standard.

      There will always be collisions, because a hash maps arbitrarily large inputs to a fixed-size output, so the definition of hashing implies collisions (and this still applies). The reason for the fuss about SHA-1 is that collisions can be found significantly faster than the “random”, brute-force rate you would expect from an ideal hash. How important this actually is, and whether other hashing functions could be shown to have similar weaknesses in the future, is pretty much unknown.

      TL;DR: No, the situation as far as I understand it is an obsolete software/operating system problem that should be fixed “internally” by Google, Microsoft, and users of Google and Microsoft, and which is used by CAs as a poor excuse to drag their feet. It's not about embedded hardware or anything like that either.

      --
      Bite harder Ouroboros, bite! tails.boum.org/ linux USB CD secure desktop IRC *crypt tor (not endorsements (XKeyScore))