
SoylentNews is people

posted by Fnord666 on Monday June 10 2019, @11:48AM
from the we'll-see dept.

Submitted via IRC for SoyCow4463

The clever cryptography behind Apple's "Find My" feature

When Apple executive Craig Federighi described a new location-tracking feature for Apple devices at the company's Worldwide Developer Conference keynote on Monday, it sounded—to the sufficiently paranoid, at least—like both a physical security innovation and a potential privacy disaster. But while security experts immediately wondered whether Find My would also offer a new opportunity to track unwitting users, Apple says it built the feature on a unique encryption system carefully designed to prevent exactly that sort of tracking—even by Apple itself.

In upcoming versions of iOS and macOS, the new Find My feature will broadcast Bluetooth signals from Apple devices even when they're offline, allowing nearby Apple devices to relay their location to the cloud. That should help you locate your stolen laptop even when it's sleeping in a thief's bag. And it turns out that Apple's elaborate encryption scheme is also designed not only to prevent interlopers from identifying or tracking an iDevice from its Bluetooth signal, but also to keep Apple itself from learning device locations, even as it allows you to pinpoint yours.

"Now what's amazing is that this whole interaction is end-to-end encrypted and anonymous," Federighi said at the WWDC keynote. "It uses just tiny bits of data that piggyback on existing network traffic so there's no need to worry about your battery life, your data usage, or your privacy."

[...] That system would obviate the threat of marketers or other snoops tracking Apple device Bluetooth signals, allowing them to build their own histories of every user's location. "If Apple did things right, and there are a lot of ifs here, it sounds like this could be done in a private way," says Matthew Green, a cryptographer at Johns Hopkins University. "Even if I tracked you walking around, I wouldn't be able to recognize you were the same person from one hour to the next."

In fact, Find My's cryptography goes one step further than that, denying even Apple itself the ability to learn a user's locations based on their Bluetooth beacons. That would represent a privacy improvement over Apple's older tools like Find My iPhone and Find Friends, which don't offer such safeguards against Apple learning your location.
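The excerpt doesn't spell out the mechanism, but Green's "from one hour to the next" remark points at the core idea: the beacon identifier rotates on a schedule, so observers see unlinkable pseudonyms while the owner, who holds a shared secret, can re-derive every identifier and query for matching reports. A minimal sketch of that rotating-identifier idea (the HMAC construction, seed, and epoch scheme here are illustrative assumptions, not Apple's actual protocol, which uses public-key cryptography):

```python
import hashlib
import hmac

def beacon_id(seed: bytes, epoch: int) -> bytes:
    """Derive the broadcast identifier for one time epoch.
    Without the seed, consecutive identifiers look unrelated."""
    return hmac.new(seed, epoch.to_bytes(8, "big"), hashlib.sha256).digest()[:16]

# The lost device broadcasts a fresh identifier each epoch.
seed = b"secret shared only among the owner's devices"
broadcasts = [beacon_id(seed, e) for e in range(3)]

# A passer-by (or the server) sees only unlinkable pseudonyms...
assert len(set(broadcasts)) == 3

# ...but the owner, holding the seed, can re-derive the same IDs
# and ask the server for any location reports filed under them.
assert beacon_id(seed, 1) == broadcasts[1]
```

In the real system the rotating value is reportedly a public key, so finders can also encrypt the location report to the owner; this sketch only captures the unlinkability property Green describes.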


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 5, Insightful) by AthanasiusKircher on Monday June 10 2019, @11:59AM (11 children)

    by AthanasiusKircher (5291) on Monday June 10 2019, @11:59AM (#853644) Journal

    The only way to truly know that such a feature works as advertised, has no backdoors, doesn't track, doesn't retain info it claims not to, etc. would be the ability to audit the source code.

    Unfortunately, of course, Apple is not likely to release that. So, no, I don't have any good reason to believe a word they say. Perhaps it's true. Of the megacorps, Apple has a somewhat better apparent record in consumer privacy. But I say "apparent," because there are always a lot of unknowns in closed-source software.

  • (Score: 0, Disagree) by Anonymous Coward on Monday June 10 2019, @12:09PM (3 children)

    by Anonymous Coward on Monday June 10 2019, @12:09PM (#853649)
    • (Score: 2) by opinionated_science on Monday June 10 2019, @12:20PM (2 children)

      by opinionated_science (4031) on Monday June 10 2019, @12:20PM (#853651)

    If it is that good (I was out the other night and someone used it, so it seems very effective), can we have it as a default on Android?

      Surely, this benefits everyone?
      Apple, as they can use the *huge* android market to find their phones.
      Google, as the feature parity gives them some street cred ;-)

      I've only used the Android "find phone" feature via my Android watch (LG Sport), so I have no idea how it compares to Apple's.

      Anyone have some data on this?

      • (Score: 0) by Anonymous Coward on Monday June 10 2019, @12:34PM

        by Anonymous Coward on Monday June 10 2019, @12:34PM (#853655)

        A methodically correct chain of trust between devices owned by a single user is the main reason I use Apples for critical operations, like banking. No other platform has achieved that level of security so far.

      • (Score: 3, Funny) by mhajicek on Monday June 10 2019, @03:01PM

        by mhajicek (51) on Monday June 10 2019, @03:01PM (#853700)

        Well, to be fair, you have to compare Apples to Apples.

        --
        The spacelike surfaces of time foliations can have a cusp at the surface of discontinuity. - P. Hajicek
  • (Score: 3, Interesting) by JoeMerchant on Monday June 10 2019, @12:46PM (1 child)

    by JoeMerchant (3937) on Monday June 10 2019, @12:46PM (#853659)

    The only way to truly know that such a feature works as advertised, has no backdoors, doesn't track, doesn't retain info it claims not to, etc. would be the ability to audit the source code.

    Even then, you're not really sure - all that you know is that the publicly publishing community has or has not found an exploitable flaw, yet.

    Unfortunately, of course, Apple is not likely to release that.

    Depends: are you interested in feeling sure because you trust the open security community more, or are you interested in actual security? Keeping the source code and algorithm design documentation "secret" does significantly reduce the attack surface and increase the time it takes to find a flaw.

    Even if Apple did release the full algorithm design documentation, source code, etc. that's no guarantee that exploitable flaws and even intentional back doors don't exist elsewhere in the system.

    I'm not a fanboi, and when any company touts the "clever security" of their proprietary privacy protection algorithms, I don't get much of a warm fuzzy at all from that - maybe better than using systems like e-mail that are known open channels. In my opinion, the best security comes from layers - use a well documented "secure as we know how" layer, or two, then also use a few layers of weird stuff that people are going to have to figure out from scratch rather than using well known exploits from the standard toolkits. Odds are, your secrets aren't worth the effort, and if your secrets are worth the effort, there are always cheaper ways to get them [xkcd.com].

    --
    🌻🌻 [google.com]
    • (Score: 3, Informative) by AthanasiusKircher on Monday June 10 2019, @01:25PM

      by AthanasiusKircher (5291) on Monday June 10 2019, @01:25PM (#853665) Journal

      Depends, are you interested in feeling sure because you trust the open security community more, or are you interested in actual security

      Obviously you make good points. I know there have been and always will be security holes in open source too. But I resent the dichotomy drawn between open source and "actual security," as if the latter were even a real thing. There are various ways to attain security, and yes, if you want the best possible security, you should be smarter than anyone else and write your own closed-source software that only you know about and that you always audit perfectly for flaws.

      Of course, in the real world there are trade-offs. No one person or organization can find all flaws. Do I believe that if Apple AND the open-source community look for flaws, they'd be more likely to find them than if only Apple's security folks were looking? Probably. Does having open source create easier vectors for attack? Perhaps.

      But my post wasn't just about security. It was about Apple's claims of anonymity, no tracking, etc. Sorry, but enough companies in recent years have claimed "we're not tracking you or keeping your data" and turned out to be doing so in various ways that I'm not going to trust THAT aspect without a public audit of the source code.

      TL;DR - My post was more about privacy implications than security per se.

  • (Score: 5, Informative) by EvilSS on Monday June 10 2019, @04:47PM (3 children)

    by EvilSS (1456) Subscriber Badge on Monday June 10 2019, @04:47PM (#853749)
    You still can't verify that the audited code is the production code. At some point, some level of trust is required if you are relying on a third party to do anything for you.
    • (Score: 2) by AthanasiusKircher on Monday June 10 2019, @09:35PM (2 children)

      by AthanasiusKircher (5291) on Monday June 10 2019, @09:35PM (#853896) Journal

      True, though if you actually have access to the source code, you should be able to compile it yourself if you so choose. I realize many people may not choose to do that, but it's possible with open source. It's not when the source is not available.

      • (Score: 2) by EvilSS on Monday June 10 2019, @09:51PM (1 child)

        by EvilSS (1456) Subscriber Badge on Monday June 10 2019, @09:51PM (#853906)
        Yeah, but you still can't verify the back-end software being run by Apple, even if you compile it. To do that, Apple would have to let literally anyone come in at any time and run their own hashing tool against the binaries running in production to verify they match the binaries compiled from the open code. I don't see that happening. Now you could have an outside auditor do it, but again, you are forced to trust a third party; the auditor could be bribed or otherwise influenced.
        • (Score: 2) by pkrasimirov on Tuesday June 11 2019, @01:19PM

          by pkrasimirov (3358) Subscriber Badge on Tuesday June 11 2019, @01:19PM (#854185)

          It goes a long way beyond that. You should be able to check the hashing program too, and to install and use a different one if you want, like SHA-384 or SHA3-512. The result should match the checksum of the binary you compiled from source, byte for byte, meaning you need the exact compiler settings and configuration directives. And the exact compiler.
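
The chain described above bottoms out in a byte-for-byte digest comparison between your reproducible build and the binary actually deployed. A minimal sketch of that final step (the "binaries" here are throwaway stand-ins; in practice you would hash the vendor's production binary against your own build):

```python
import hashlib
import tempfile

def sha3_512(path: str) -> str:
    """Hash a file in chunks so large binaries need not fit in memory."""
    h = hashlib.sha3_512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Two throwaway "binaries": identical bytes hash identically, while
# any difference (e.g. a compiler flag changing one byte) does not.
with tempfile.NamedTemporaryFile(delete=False) as a, \
     tempfile.NamedTemporaryFile(delete=False) as b, \
     tempfile.NamedTemporaryFile(delete=False) as c:
    a.write(b"\x7fELF same build")
    b.write(b"\x7fELF same build")
    c.write(b"\x7fELF same build, different flags")

assert sha3_512(a.name) == sha3_512(b.name)  # reproducible build: match
assert sha3_512(a.name) != sha3_512(c.name)  # anything else: mismatch
```

Of course, as the comment notes, this only pushes the trust question down a level: you now have to trust the hashing tool, the compiler, and the kernel underneath it.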

          And check that drivers/kernel do not add some "instrumentation" for "debugging", "monitoring", "telemetrics" etc.

          Also what happens when the information is decrypted? You can check the code does not save it or retain it anywhere but better also check there is no other process to read this memory meanwhile. Including in the Intel Management Engine or equivalent.

          Eventually, inevitably, it comes down to "all or nothing," as crazy old RMS always said. But we don't want to be crazy, do we? So we just trust Apple because they say they care about our privacy. And Google because they are not evil. And then the others, because we already trust some and they are all the same.

          Yeah, Apple said they don't track me. But they will immediately tell me where my phone is if I ask them. Awesome!

  • (Score: 2) by FatPhil on Wednesday June 12 2019, @08:01AM

    by FatPhil (863) <{pc-soylent} {at} {asdf.fi}> on Wednesday June 12 2019, @08:01AM (#854560) Homepage
    I'd take a security audit from a well-trusted cryptographer. That's how the original Skype did things (not Microsoft Skype, which doesn't even pass the smell test when it comes to security): Prof. Nigel Smart from the University of Bristol had access to their white paper, and to their source under NDA, and he verified firstly that the mathematics/cryptography in the white paper was sound, and secondly that the code implemented the algorithms in the white paper. (And he could also verify that the code built the binaries.)

    Of course, I'd *prefer* open source, but it's not the be-all and end-all of security and trust; in reality it's barely even the start. Experience shows that most commercial products without a public developer community behind them from early on get almost no scrutiny from the public when open sourced; the myth of a million eyes making all bugs shallow is mostly a fairy story. (I say that as someone who has worked almost exclusively on commercial open source software that has had wide community support from very early days, such as Linux, and I can tell the difference between what we do and what unnamed others do.)
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves