
posted by CoolHand on Thursday April 30 2015, @11:12PM
from the we're-not-really-just-procrastinating-honest! dept.

The Register covers the difficulty of putting the SHA-1 crypto algorithm to bed:

The road towards phasing out the ageing SHA-1 crypto hash function is likely to be littered with potholes, security experts warn.

SHA-1 is a hashing (one-way) function that converts information into a shortened "message digest", from which it is computationally infeasible to recover the original information. This hashing technique is used in digital signatures, in verifying that the contents of software downloads have not been tampered with, and in many other cryptographic applications.
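
As a rough illustration of the "message digest" idea, here is a minimal Python sketch of digest-based download verification; the file name and the published digest value are placeholders rather than anything from the article.

import hashlib

def sha256_of_file(path: str) -> str:
    """Hash a file in chunks and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# The digest published alongside the download (placeholder value).
published_digest = "placeholder-digest"
if sha256_of_file("installer.exe") == published_digest:
    print("digest matches: the download was not altered in transit")
else:
    print("digest mismatch: do not trust this file")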

The SHA-1 algorithm – published in 1995 – is showing its age and is no longer safe from collision attacks, in which two different blocks of input data produce the same output hash. This is terminal for a hash function, because it paves the way for attackers to offer manipulated content that carries the same hash value as pukka packets of data.

Certificate authorities and others are beginning to move on from SHA-1 to its replacement, SHA-2. Microsoft announced its intent to deprecate SHA-1 in November 2013. More recently, Google joined the push with a decision to make changes in the latest version of its browser, Chrome version 42, so that SHA-1 certificates are flagged up as potentially insecure.

Just updating to SHA-2 is not as simple as it might seem, because of compatibility issues with Android and Windows XP. More specifically, Android before 2.3 and XP before SP3 are incompatible with the change (a fuller compatibility matrix maintained by digital certificate firm GlobalSign can be found here).
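
For readers wondering what their own sites present, a small sketch along these lines can print the hash algorithm a server's leaf certificate is signed with. It assumes the third-party pyca/cryptography package; the host name is only an example, and older versions of the library also expect a backend argument.

import ssl
from cryptography import x509

# Fetch the server's certificate in PEM form and parse it.
pem = ssl.get_server_certificate(("example.com", 443))
cert = x509.load_pem_x509_certificate(pem.encode())

# Prints e.g. "sha1" or "sha256" depending on how the certificate was signed.
print(cert.signature_hash_algorithm.name)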

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2) by Yog-Yogguth (1862) Subscriber Badge on Saturday May 02 2015, @02:04AM (#177725) Journal

    I like the idea as a concept on its own, but it's deceptive and, as far as I can tell, doesn't relate to the situation at hand.

    On its own as a concept (i.e. ignoring the rest of this comment) it should work as far as I can tell, but I'm not a cryptographer. It would be somewhat reminiscent of double hashing [wikipedia.org], except at the hash-algorithm level. I would think (but don't know for sure) that independent salting of each hash should take care of any cryptographic problems (maintaining the one-way nature of hashing). All of this requires more resources, both work (energy, time) and storage (so less space on the same hardware), but those could be non-issues.
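
    A minimal sketch of what that independently salted, parallel hashing might look like, assuming SHA-1 and SHA-256 as the two algorithms (the record layout and names are illustrative only, not from the comment):

import hashlib
import os

def make_record(data: bytes) -> dict:
    """Store two digests, each over the data prefixed with its own random salt."""
    salt1, salt2 = os.urandom(16), os.urandom(16)
    return {
        "salt1": salt1,
        "salt2": salt2,
        "sha1": hashlib.sha1(salt1 + data).digest(),
        "sha256": hashlib.sha256(salt2 + data).digest(),
    }

def verify(data: bytes, record: dict) -> bool:
    # Forging data now requires a simultaneous collision in both functions.
    return (hashlib.sha1(record["salt1"] + data).digest() == record["sha1"] and
            hashlib.sha256(record["salt2"] + data).digest() == record["sha256"])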

    It's not a completely backwards-compatible solution, since in most situations where one could implement such a thing retroactively (i.e. without changing equipment) one is likely to instead be able to upgrade to a stronger hash algorithm (which is as good as certain to be more efficient than parallel multiple hashes).

    It could make the problems worse, since instead of upgrading or changing some equipment you now have to change all of the equipment so it understands the new approach (no equipment currently has it).

    There might not be any use cases left at that point (I don't know, but I suspect there aren't).

    Would it make sense instead as future-proofing against weaknesses in specific algorithms? Again I kind of think it would, but I don't have the confidence to say for sure, or to say whether the utility is worth the effort (I strongly suspect it /isn't/ worth it; there will always be some pain involved with old/“obsolete” systems). If it is worth it, then it's worth it as an open-ended, gradually introduced standard.

    There will always be collisions, because the definition of hashing implies collisions (and this still applies). The reason for the fuss about SHA-1 is that collisions can be found with considerably less work than a purely “random” brute-force (birthday) search would need. How important this actually is, and whether other hashing functions will be shown to have similar weaknesses in the future, is pretty much unknown.
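
    A toy illustration of that point: truncating SHA-1 to 32 bits makes a collision findable in roughly 2**16 attempts (the birthday bound), whereas full 160-bit SHA-1 should generically cost about 2**80 work; the worry is that analytic attacks are believed to need far less than that. This only demonstrates the birthday effect and is not an attack on SHA-1 itself.

import hashlib
import itertools

def truncated_sha1(data: bytes, bits: int = 32) -> bytes:
    """Return only the first `bits` bits of the SHA-1 digest."""
    return hashlib.sha1(data).digest()[: bits // 8]

seen = {}
for i in itertools.count():
    msg = str(i).encode()
    d = truncated_sha1(msg)
    if d in seen and seen[d] != msg:
        print(f"collision after {i} attempts: {seen[d]!r} and {msg!r}")
        break
    seen[d] = msg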

    TL;DR: No, the situation as far as I understand it is an obsolete software/operating system problem that should be fixed “internally” by Google, Microsoft, and their users, and which is used by CAs as a poor excuse to drag their feet. It's not about embedded hardware or anything like that either.

    --
    Bite harder Ouroboros, bite! tails.boum.org/ linux USB CD secure desktop IRC *crypt tor (not endorsements (XKeyScore))