NIST calls time on SHA-1, sets 2030 deadline:
The US National Institute of Standards and Technology (NIST) says it's time to retire Secure Hash Algorithm-1 (SHA-1), a 27-year-old algorithm with known weaknesses that is still used in security applications.
"We recommend that anyone relying on SHA-1 for security migrate to SHA-2 or SHA-3 as soon as possible," said NIST computer scientist Chris Celi, in a canned statement on Thursday.
As soon as possible isn't necessarily all that soon: NIST says you should be rid of SHA-1 from your software and systems by December 31, 2030. Meanwhile, the tech industry has largely moved on already.
[...] Despite its known weakness, SHA-1 has shown up in recent years propping up legacy applications and providing shoddy password storage. Microsoft finally got around to dropping SHA-1 from the Windows update process in August 2020.
[...] Celi explains that modules still using SHA-1 after 2030 will be ineligible for purchase by the federal government. Having eight years to submit an update may seem like more than enough time, but Celi warns there may be a backlog of submissions as the deadline nears. Developers wishing to avoid a potential validation delay should submit revised code sooner rather than later.
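For application code, the migration NIST recommends is often a one-line change, since SHA-2 and SHA-3 are drop-in replacements in most hashing APIs. A minimal sketch using Python's standard-library `hashlib` (the message is illustrative):

```python
import hashlib

data = b"example message"

# Legacy: SHA-1 produces a 160-bit (20-byte) digest and is no longer
# considered collision-resistant.
legacy = hashlib.sha1(data).hexdigest()

# NIST-recommended replacements: SHA-2 (here SHA-256) or SHA-3.
sha2 = hashlib.sha256(data).hexdigest()
sha3 = hashlib.sha3_256(data).hexdigest()

print(len(legacy), len(sha2), len(sha3))  # hex digits: 40 64 64
```

Note that the replacements produce longer digests (256 bits rather than 160), so any database column or protocol field sized for a SHA-1 digest needs widening as part of the migration.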
Related Stories
US NIST Unveils Winning Encryption Algorithm for IoT Data Protection
The National Institute of Standards and Technology (NIST) announced that ASCON is the winning bid for the "lightweight cryptography" program to find the best algorithm to protect small IoT (Internet of Things) devices with limited hardware resources:
Small IoT devices are becoming increasingly popular and omnipresent, used in wearable tech, "smart home" applications, and so on. Yet they store and handle sensitive personal information, such as health data and financial details.
A standard for encrypting that data is therefore crucial to securing it. However, the weak chips inside these devices call for an algorithm that can deliver robust encryption with very little computational power.
"The world is moving toward using small devices for lots of tasks ranging from sensing to identification to machine control, and because these small devices have limited resources, they need security that has a compact implementation," stated Kerry McKay, a computer scientist at NIST.
[...] ASCON was eventually picked as the winner for its flexibility (it encompasses seven families), energy efficiency, speed on weak hardware, and low overhead for short messages.
NIST also considered that the algorithm had withstood the test of time, having been developed in 2014 by a team of cryptographers from Graz University of Technology, Infineon Technologies, Lamarr Security Research, and Radboud University, and winning the CAESAR cryptographic competition's "lightweight encryption" category in 2019.
More info at the algorithm's website and in the technical paper submitted to NIST in May 2021.
Related:
- NIST Drafts Revised Guidelines for Digital Identification in Federal Systems
- NIST Calls Time on SHA-1, Sets 2030 Deadline
(Score: 3, Funny) by bzipitidoo on Monday December 19, @03:01AM (8 children)
Large organizations can be incredibly slow to update. I really don't know what their deal is. They're super conservative about software updates, and I can understand that. But sticking with stuff that was cracked a decade or more ago, when a good replacement has been available for at least the past five years? That isn't conservatism, that's lazy, bureaucratic sluggishness.
My own anecdote about that was the time I was given access to some government data. They insisted on telephoning me to tell me the password, because that was more "secure" than email. I suggested they encrypt the email if they were so worried about it, but they didn't seem to understand what that meant. Okay, fine, tell me the account info over the phone, and hope it isn't tapped. Then came the kicker. The account info was for ... telnet! Nope, they didn't have ssh set up. ssh had been available for 7 years at that point, but they were still using plain old telnet. SMH.
(Score: 2) by coolgopher on Monday December 19, @03:49AM
😂🤦♀️
(Score: 0) by Anonymous Coward on Monday December 19, @10:25AM
The insistence on using insecure things like telnet to access secure sites/installations back then might have looked insane. But factor in that unencrypted traffic made it trivial for them to monitor everything, and it starts making sense, albeit slightly warped sense...
Used to have this issue with a certain 'unclear' institute in France back in the day - no encrypted traffic allowed in or out of the site. I had to re-enable telnet at our site so that our researchers could check their email using pine, and set up the firewall to only allow inbound/outbound telnet traffic to/from their netblocks.
(Score: 0) by Anonymous Coward on Monday December 19, @01:29PM
Sometimes one of the issues that occurs is a chicken/egg type of problem where neither side wants to commit first to the development of the support for the change. The other is that until it becomes an actual compliance issue, some companies don't want to spend any development time/effort on something that doesn't have any return on investment. They'll only do it if someone else pays them to or compliance forces them to.
(Score: 2) by ElizabethGreene on Monday December 19, @03:56PM (4 children)
I have on my calendar today a meeting to discuss modernizing apps off of (no shit, honest to god) Windows NT and 2000. It's been decades, guys. Decades plural. Damn.
(Score: 1) by pTamok on Monday December 19, @07:34PM (3 children)
First law of rural motor mechanics applies: "If it ain't broke, don't fix it."
If the systems are not connected to the Internet, or to a network connected to the Internet, and are meeting the requirements, there may well be no reason to migrate. Of course if the hardware fails, you need backups...
If you can achieve what you need in a VM running NT or 2K, that may be a way to go. Banks still use COBOL programs written decades ago. Just because it is old, doesn't mean it should be replaced.
(Score: 3, Interesting) by ElizabethGreene on Tuesday December 20, @03:52AM (2 children)
I agree with this in many aspects. For example, a fifty-year-old oil well pump jack in the middle of a field is obsolete, does an important job, and doesn't obviously need replacing unless it breaks in a way that isn't easily repairable. It's discrete, does a clearly defined job, and doesn't create failure points for other technologies.
In IT it gets a lot fuzzier. A Windows 2003 system that's joined to a domain is an eight-year-old kid's lemonade stand in the middle of Times Square. Yes, it sells lemonade. Despite that, it probably shouldn't be there, because it's going to get exposed to something it shouldn't [foxnews.com]. And when that happens, it won't impact just that lemonade stand, it will impact everyone.
Putting myself in the shoes of an attacker, if I find a 2003 machine, I know I've got a laundry list of exploits ready to go. It hasn't been patched in, optimistically, seven years; more realistically, ten. I can run through the CVE-2013-* and CVE-2014-* RCEs and I'll hit on one in no time.
(Score: 1) by pTamok on Tuesday December 20, @10:30AM (1 child)
Well, yes, which is why I said "If the systems are not connected to the Internet, or to a network connected to the Internet...". Remotely accessible devices need good security. To be fair, anything like that should also be protected from public access to avoid people taking easy advantage of physical access.
There are expensive machines that use WinNT and Win2K PCs as front-ends for industrial processes, reliant on old technology interfaces where it is just not possible to replace the OS and hardware. It's difficult to make stuff 'future proof' over periods of decades, so I'm not saying people 'should have known', but I guess some industrial design departments could do with a work unit on interfaces and protocols that last. It's not for nothing that the lowest common denominator for communicating with devices is a two-wire (or three, with ground) async serial interface. Making yourself dependent on Windows device drivers for old versions of Windows, using obsolete interface hardware, isn't ideal.
(Score: 2) by ElizabethGreene on Tuesday December 20, @06:16PM
I'm with you on this. It would be a good practice for people building kit like that today to use interfaces they can reasonably trust will be around for decades, e.g. USB or Ethernet. In the case of USB, it should present as a raw serial device. Making it a 'special' device, i.e. implementing custom device profiles that require a manufacturer-specific driver, is a mistake unless you want to be in the business of publishing that driver for decades. For Ethernet, again the rule is "no fancy stuff": implement over IP; there's no reason to invent your own protocol. :)
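The "no fancy stuff" approach above can be sketched in a few lines: a newline-delimited ASCII command protocol over plain TCP, which any OS with an IP stack can talk to without a vendor driver. This is a hypothetical illustration; the device, the command name, and the reply are all invented for the example:

```python
import socket
import threading

def serve_once(server_sock: socket.socket) -> None:
    """Pretend device front-end: answer line-based ASCII commands over TCP."""
    conn, _ = server_sock.accept()
    with conn, conn.makefile("rw", encoding="ascii", newline="\n") as io:
        for line in io:
            cmd = line.strip()
            # "READ TEMP" is an illustrative command, not a real instrument's.
            io.write("21.5\n" if cmd == "READ TEMP" else "ERR unknown command\n")
            io.flush()

# Listen on an ephemeral localhost port and serve one connection in the background.
server = socket.create_server(("127.0.0.1", 0))
port = server.getsockname()[1]
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

# A client needs nothing beyond a TCP socket: no framing library, no driver.
with socket.create_connection(("127.0.0.1", port)) as client:
    with client.makefile("rw", encoding="ascii", newline="\n") as io:
        io.write("READ TEMP\n")
        io.flush()
        reply = io.readline().strip()

print(reply)
```

Because the wire format is human-readable text, the device stays debuggable decades later with nothing more exotic than `nc` or a telnet client.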
We live in interesting times.
(Score: 2) by Opportunist on Monday December 19, @06:30AM
Quite a few standards already ban SHA-1 from use, and have done so for almost half a decade now.
(Score: 1) by pTamok on Monday December 19, @07:51PM
SHA-1 is still a perfectly good hash, it 'just' shouldn't be used for security critical purposes. There may well be better hash functions for other use-cases, but MD5 is still used for quick-n-dirty file comparisons* in deduplicating programs, so I'm sure there are still things SHA-1 can be used for.
What NIST said was "We recommend that anyone relying on SHA-1 for security migrate to SHA-2 or SHA-3 as soon as possible,"
*If the MD5 hashes of two files are the same, most such programs give the option of doing a bit-for-bit comparison of the files, just to make sure you haven't found a collision.
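The hash-then-verify pattern described above can be sketched with the standard library alone: group files by MD5 digest, then confirm candidate duplicates bit-for-bit with `filecmp` so a hash collision can never cause a false match. The function names are my own, not from any particular dedup tool:

```python
import filecmp
import hashlib
import tempfile
from collections import defaultdict
from pathlib import Path

def md5_of(path: Path) -> str:
    """Hash a file in chunks; a digest match is only a *candidate* duplicate."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(paths):
    candidates = defaultdict(list)
    for p in paths:
        candidates[md5_of(p)].append(p)
    confirmed = []
    for group in candidates.values():
        first = group[0]
        # Second pass: byte-for-byte comparison guards against collisions.
        dupes = [p for p in group[1:] if filecmp.cmp(first, p, shallow=False)]
        if dupes:
            confirmed.append([first] + dupes)
    return confirmed

# Tiny demo with throwaway files.
tmp = Path(tempfile.mkdtemp())
(tmp / "a.txt").write_bytes(b"same bytes")
(tmp / "b.txt").write_bytes(b"same bytes")
(tmp / "c.txt").write_bytes(b"different")
groups = find_duplicates(sorted(tmp.iterdir()))
print([[p.name for p in g] for g in groups])  # [['a.txt', 'b.txt']]
```

The hash pass does the cheap O(n) winnowing; the expensive byte-for-byte pass only ever runs on files that already share a digest.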