McAfee says that he and his team can break into the phone within three weeks. McAfee states that his motive for the offer is that "he didn't want Apple to be forced to implement a 'back door'".
Bill Gates has apparently sided with the FBI in the dispute over the unlocking of a "specific" iPhone, breaking with other technology industry leaders:
Apple should comply with the FBI's request to unlock an iPhone as part of a terrorism case, Microsoft founder Bill Gates says, staking out a position that's markedly different from many of his peers in the tech industry, including Facebook founder Mark Zuckerberg. The two titans aired their views on what's become a public debate over whether Apple should be compelled to unlock an iPhone used by San Bernardino shooter Syed Rizwan Farook. "This is a specific case where the government is asking for access to information. They are not asking for some general thing, they are asking for a particular case," Gates told the Financial Times.
However, in a follow-up interview with Bloomberg, Gates said he was disappointed by reports (such as my original submission #2 below) that he had sided with the FBI in its legal dispute with Apple:
In an interview with Bloomberg, Bill Gates says he was "disappointed" by reports that he supported the FBI in its legal battle with Apple, saying "that doesn't state my view on this." Still, Gates took a more moderate stance than some of his counterparts in the tech industry, not fully backing either the FBI or Apple but calling for a broader "discussion" on the issues. "I do believe that with the right safeguards, there are cases where the government, on our behalf — like stopping terrorism, which could get worse in the future — that that is valuable." But he called for "striking [a] balance" between safeguards against government power and security.
[Continues.]
Since we keep talking about Apple versus the FBI, I thought I'd propose a simple solution to the problem, which as far as I can think would satisfy most parties...
The problem is getting access to a known terrorist's encrypted information. The question is whether Apple should compromise its own security, and the trust of its customers worldwide (as other states could demand the same for their "terrorists"), for what's likely to be a limited or insignificant chunk of data. Apple gets bad publicity regardless of the outcome.
Well, it turns out that we already pay some people to secretly do what Apple is being asked to do: our good old friends at the NSA. They're pretty good at cracking "Bad Guy" systems, and people know that. So my proposal is pretty simple:
1) Give the Terrorist's encrypted device to the NSA.
2) Let it be known that a Classified meeting happened at the NSA with Apple's security gurus.
3) The NSA "allocates proper resources to defend the country against a clear computer-based threat", performs its magic, and provides access to the phone for the FBI.
What's the point?
- Apple cannot reveal what the NSA asked it to provide to help open the phone. It's Classified, which is easily justified by Apple's security being important to the US.
- The NSA doesn't have to reveal whether they could have done it without Apple's help, and whether their solution is applicable to more than just that phone.
- Apple is not compelled to create software for the government just because a judge said so, and it also stops having to explain why it seemingly protects a terrorist's data.
- Apple can keep telling customers and other governments that it is not sure how to safely bypass the security. Should another government request similar information, they may get those details which are not protected by US regulations, and if that coincidentally isn't enough to also open a target's phone, it must have been that the NSA guys are really very very good.
- The FBI gets the data they requested (officially what they want) without further delays and lawyers.
Not only would Apple and the FBI both get what they want despite their apparently incompatible goals, but the NSA would be the good guys for actually doing their job. Some people will argue that handing the secrets to the government is necessarily a bad thing. But the NSA doesn't share its recipes with other agencies, may already have those secrets anyway, and the security scheme on that phone was already superseded in newer device versions, limiting the potential for reuse.
What do Soylentils think?
Original Submission #1 Original Submission #2 Original Submission #3
(Score: 3, Insightful) by Whoever on Wednesday February 24 2016, @06:08PM
How would you feel if the court ordered you to spend some of your spare time building roads? Or, let's say you are a web developer and the government orders you to spend some time working on a government website? Why should the government be able to force anyone to build anything? This isn't a simple case of providing information that is already in Apple's possession.
(Score: 2) by bob_super on Wednesday February 24 2016, @06:40PM
> Why should the government be able to force anyone to build anything?
Technically, the government can force you to go fight someone, especially The_Evilz_Terruristz invading your country. Fighting by defeating their digital comms falls within the "wartime" things you'd be potentially compelled to assist with.
But that's not the All Writs Act.
(Score: 1, Informative) by Anonymous Coward on Wednesday February 24 2016, @08:20PM
The government isn't forcing me to fight anyone. Sure, as a conscientious objector, I won't get paid, will get the shitty jobs, and have a few more downsides, but the clean conscience is worth it.
Also, you should look up how the U.S. treats objectors, both historically and under current regulations. Boy, the war machine really hates them.
(Score: 2) by khchung on Thursday February 25 2016, @12:05AM
How would you feel if the court ordered you to spend some of your spare time building roads?
Probably better than the court ordering you to build a backdoor in your home so the FBI can come in to investigate "a very specific case". Of course, you also knew that once the backdoor was built, it just makes it easy for the FBI to use it again and again.
PLUS, with this precedent, now you could look forward to building backdoors for all your future new homes, too.
(Score: 2) by Bogsnoticus on Thursday February 25 2016, @02:23AM
How would you feel if the court ordered you to spend some of your spare time building roads?
Your logic is flawed with regard to Apple being asked to "crack" the security via a custom firmware update, as it is something Apple does not do as a standard part of its business model.
Please explain, with a rational, logical argument, how this case is any different to Apple pushing out a custom firmware update to deal with the Error 53 Fingerprint Scanner issue?
Both involve Apple deliberately bypassing their own security to grant access to the contents of the phone.
At the moment, it seems to boil down to:
Customers demanding Apple bypass the security of their phone due to the customer's decision to purchase dodgy hardware for their device = good.
FBI asking Apple to assist in bypassing the security of an iPhone that was used by a mass-murderer/terrorist = bad.
And I do not want to hear anything about it setting precedents, as Apple set their own precedent when they came up with the Error 53 solution.
Genius by birth. Evil by choice.
(Score: 3, Informative) by quacking duck on Thursday February 25 2016, @03:19AM
The fix for Error 53 did not bypass their own security to grant access to the contents of the phone. Error 53 was a failure to complete authorization with Apple's servers after installing an iOS update, due to an overly aggressive component security check.
The fix for Error 53 is what *should* have happened if the TouchID sensor is "tampered" with: Touch ID and related functions (e.g. fingerprint login and Apple Pay) remain disabled until the device is taken to an Apple store to pair the components again. The rest of the phone remains usable and secure at the basic level.
TouchID is not the primary means of device access, it's just the most convenient. The passcode/phrase is still the final say, since you can't use TouchID after restarting the iPhone, or to initiate an iOS update, or if you haven't used TouchID for 48 hours.
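The fallback rules described above can be sketched as a simple gating function. This is purely an illustrative model of the policy as stated in the comment; the names, structure, and the 48-hour constant are assumptions for the sketch, not Apple's actual implementation or API.

```python
from dataclasses import dataclass

TOUCH_ID_TIMEOUT_HOURS = 48  # per the comment: passcode required after 48 hours of non-use

@dataclass
class DeviceState:
    rebooted_since_unlock: bool      # restarted since the passcode was last entered
    pending_os_update: bool          # an iOS update is being initiated
    hours_since_touch_id_use: float  # time since Touch ID was last used
    sensor_paired: bool              # sensor still paired (not in the "Error 53" tampered state)

def touch_id_allowed(state: DeviceState) -> bool:
    """Return True only when Touch ID may substitute for the passcode."""
    if not state.sensor_paired:
        return False  # tampered/replaced sensor: Touch ID stays disabled
    if state.rebooted_since_unlock:
        return False  # passcode required after every restart
    if state.pending_os_update:
        return False  # passcode required to initiate an update
    if state.hours_since_touch_id_use >= TOUCH_ID_TIMEOUT_HOURS:
        return False  # passcode required after prolonged non-use
    return True

# The passcode path is always available; Touch ID is only a convenience layer on top.
print(touch_id_allowed(DeviceState(False, False, 1.0, True)))  # True
print(touch_id_allowed(DeviceState(True, False, 1.0, True)))   # False
```

The point of the model: every path that matters for security funnels back to the passcode, which is why a disabled Touch ID sensor never locks you out of your own data.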
The fix involves restoring the device from iTunes, after which you still need to enter a passcode. This is the case with *any* iOS update.
So no, they have not "bypassed their own security to grant access to the contents of the phone," that is a gross misrepresentation of what the fix actually does.
(Score: 3, Interesting) by Bogsnoticus on Thursday February 25 2016, @03:33AM
Thank you for the clarification.
When I asked on other forums, I received very flaky, often frothy-mouthed rants with very few actual facts in them.
Until this situation, I paid little attention to Apple security features, as I'm not an iThingy owner and don't have to support any in my circle of family or friends.
I stand corrected, and informed. *tips hat*
Genius by birth. Evil by choice.
(Score: 2) by Whoever on Thursday February 25 2016, @04:34AM
Simple. One is done to maintain or increase sales and the other is likely to reduce sales.
In other words, you are not really interested in reading about anything that contradicts your world view. That explains quite clearly what type of person you are.
(Score: 2) by quacking duck on Thursday February 25 2016, @02:23PM
To be fair to Bogsnoticus, I challenged him at length on this [soylentnews.org], and he acknowledged that he stood corrected.
I am grateful there are still people willing to consider corrections and publicly acknowledge their error, rather than doubling down on their initial strong stance based on incomplete information.
(Score: 2) by Anal Pumpernickel on Thursday February 25 2016, @05:17AM
Please explain, with a rational, logical argument, how this case is any different to Apple pushing out a custom firmware update to deal with the Error 53 Fingerprint Scanner issue?
Choice, for one. Apple chose to do that themselves.
And I do not want to hear anything about it setting precedents, as Apple set their own precedent when they came up with the Error 53 solution.
What does Apple's past action have to do with government precedent? Nothing. The consequences of the two are far too different, and bad court precedent can have terrible societal consequences.