Apple has been ordered to assist in the unlocking of an iPhone belonging to one of the San Bernardino shooters. This may require updating the firmware to bypass restrictions on PIN unlock attempts:
Apple must assist the FBI in unlocking the passcode-protected encrypted iPhone belonging to one of the San Bernardino shooters in California. US magistrate Sheri Pym says Cupertino must supply software that prevents the phone from automatically annihilating its user data when too many password attempts have been made.
The smartphone belonged to Syed Farook, who with his wife Tashfeen Malik shot and killed 14 coworkers on December 2. The couple died in a gun battle with police soon after. Cops have been unable to access Syed's iPhone 5C because they do not know the correct PIN, and will now gain the assistance of Apple, as ordered by Judge Pym [PDF] on Tuesday.
iOS 8 and above encrypts data on devices, requiring a four to six-digit PIN to unlock. After the first few wrong guesses, iOS waits a few minutes between accepting further PIN entry attempts, escalating to an hour's delay after the ninth failed login.
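The practical effect of that delay schedule can be sketched in a few lines of Python. The per-attempt delays below approximate Apple's published behaviour (1 minute after the fifth wrong guess, 5 minutes after the sixth, 15 minutes after the seventh and eighth, an hour from the ninth on); they are an assumption for illustration, not lifted from iOS itself:

```python
# Approximate per-attempt lockout delays, in seconds (assumed schedule).
DELAYS = {
    1: 0, 2: 0, 3: 0, 4: 0,  # first few wrong guesses: no delay
    5: 60,                   # 1 minute
    6: 5 * 60,               # 5 minutes
    7: 15 * 60,              # 15 minutes
    8: 15 * 60,
    9: 60 * 60,              # an hour, and for every attempt thereafter
}

def total_lockout_seconds(attempts: int) -> int:
    """Cumulative enforced delay after `attempts` consecutive wrong PINs."""
    total = 0
    for n in range(1, attempts + 1):
        total += DELAYS.get(n, 60 * 60)  # hour-long delay beyond the ninth
    return total

print(total_lockout_seconds(9))   # delay racked up by the ninth failure
print(total_lockout_seconds(10))
```

At an hour per attempt beyond the ninth, brute-forcing all 10,000 four-digit PINs would take over a year even before the auto-wipe option is considered, which is why the court order specifically demands the delays be removed.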
[...] Judge Pym wants Apple to come up with some magic software – perhaps a signed firmware update or something else loaded during boot-up – that will allow the FBI to safely brute-force the PIN entry without the device self-destructing. This code must only work on Farook's phone, identified by its serial numbers, and no other handset. The code must only be run on government or Apple property, and must not slow down the brute-forcing process.
Apple has five days to appeal or demonstrate that it cannot comply with the order. It is crucial to note that the Central District of California court has not instructed Apple to crack its encryption – instead it wants Apple to provide a tool to effectively bypass the unlocking mechanism. "It's technically possible for Apple to hack a device's PIN, wipe, and other functions. Question is can they be legally forced to hack," said iOS security expert Jonathan Zdziarski.
According to this Reuters article, "A U.S. judge on Tuesday ordered Apple Inc to help the FBI break into a phone recovered from one of the San Bernardino shooters, an order that heightens a long-running dispute between tech companies and law enforcement over the limits of encryption.
Apple must provide "reasonable technical assistance" to investigators seeking to unlock the data on an iPhone 5C that had been owned by Syed Rizwan Farook, Judge Sheri Pym of U.S. District Court in Los Angeles said in a ruling."
"...Forensics expert Jonathan Zdziarski said Tuesday Apple might have to write custom code to comply with the order, presenting a novel question to the court about whether the government could order a private company to hack its own device.
Zdziarski said that because the San Bernardino shooting was being investigated as a terrorism case, investigators would be able to work with the NSA and CIA on cracking the phone. Those U.S. intelligence agencies likely could break the iPhone's encryption without Apple's involvement, he said."
Update: EFF to file an amicus brief in support of Apple's position.
Update 2: mendax writes: The New York Times has some "breaking news" which says that Apple will not comply with the judge's order. It's a good way to get in trouble with the judge but it's the right decision on Apple's part.
Previously: FBI Unable to Decrypt California Terrorists' Cell Phone
(Score: 2) by Immerman on Wednesday February 17 2016, @04:49PM
In fact, open source might make the device easier to compromise, since it would be far simpler to write your own "no limits" version of the OS. Then the only defense would be "tivoization", so that the device would only run a properly signed OS and cooperation with the manufacturer (or at least a copy of their signing key) would still be required.
Of course, if the device doesn't require a signed OS then any halfway competent attacker should be able to edit the binary directly to remove the limitations. So presumably iPhones require signed binaries and the FBI hasn't yet acquired the keys. [Dons tinfoil hat] Or at least that's what the FBI wants the public to think.
Hmm, I suppose a signing requirement would also be needed for limiting the compromised software to a single device - hard-code the serial number check and even though it's easy to modify for another phone, the modified version will no longer be signed.
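That serial-number idea can be illustrated with a toy sketch. This is emphatically not Apple's actual mechanism (real iOS images are verified with asymmetric signatures against a per-device identifier, not an HMAC, and all names below are hypothetical); it just shows why baking the target serial into the signed image means a modified copy fails verification on any other phone:

```python
import hashlib
import hmac

VENDOR_KEY = b"stand-in for the vendor's secret signing key"

def sign_firmware(image: bytes, device_serial: str) -> bytes:
    """Vendor signs the firmware image together with the target serial."""
    return hmac.new(VENDOR_KEY, image + device_serial.encode(), hashlib.sha256).digest()

def boot_accepts(image: bytes, sig: bytes, this_device_serial: str) -> bool:
    """Boot code recomputes the MAC over the image and *its own* serial.

    Editing the image to target another serial, or running the unmodified
    image on another device, changes the computed value and fails the check.
    """
    expected = hmac.new(VENDOR_KEY, image + this_device_serial.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)

img = b"no-limits firmware"
sig = sign_firmware(img, "TARGET-5C-SERIAL")
print(boot_accepts(img, sig, "TARGET-5C-SERIAL"))  # accepted on the target phone
print(boot_accepts(img, sig, "SOME-OTHER-PHONE"))  # rejected everywhere else
```

An HMAC stands in for a real signature scheme purely to keep the sketch self-contained; the structural point – that the device identity is under the signature, so the check can't be patched out without the key – is the same either way.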
(Score: 0) by Anonymous Coward on Thursday February 18 2016, @03:47PM
In fact open source
Not merely open source, but free software. And it's entirely possible to have free software that implements similar security measures while remaining fully under the user's control. There is no reason the manufacturer would have to be involved.
It is silly to think that hiding how the software works will somehow protect you from competent attackers.
(Score: 3, Informative) by Immerman on Thursday February 18 2016, @09:40PM
Is not Free software a strict subset of open source? I.e. all Free Software is open source, but not all open source software is Free.
And no, it's not entirely silly - if you have access to the source code there are a number of tools of varying degrees of sophistication that you can use to analyze it for likely security problems. It's probably a fair bet that competent attackers will do so using the most sophisticated tools available (especially if we're talking NSA-class attackers). It's probably also a fair bet that most open source projects won't run such high-end analysis themselves, nor immediately fix all the problems if they do.
Open source can do a great job of eliminating a lot of the "low-hanging fruit" for attackers. But as we've seen time and again, it doesn't necessarily catch the more subtle problems. Meanwhile, it exposes to well-funded attackers those subtle problems that would likely be difficult to find through black-box analysis.
Net result - your average security-conscious open source program is probably more secure against average attackers than a proprietary equivalent. But once you eliminate the low hanging fruit on both, then having the source gives you a leg up on finding more esoteric attacks. Not to mention it may make it more likely that an attacker will intentionally "poison the well" by contributing an obfuscated weakness. With proprietary software that can only be done with inside help - not that that's any sort of guarantee it doesn't happen, but it requires conspiracy rather than just the false appearance of good faith.