posted by Fnord666 on Monday June 01 2020, @01:04PM   Printer-friendly
from the OAuth2-isn't-that-hard dept.

What if I told you that your email ID is all I need to take over your account on your favorite website or app? Sounds scary, right? That is what a bug in Sign in with Apple allowed me to do.

When Apple announced Sign in with Apple at the June 2019 worldwide developers conference, it called it a "more private way to simply and quickly sign into apps and websites." The idea was, and still is, a good one: replace social logins that can be used to collect personal data with a secure authentication system backed by Apple's promise not to profile users or their app activity.

One of the plus points that got a lot of attention at the time was the ability for a user to sign up with third-party apps and services without needing to disclose their Apple ID email address. Unsurprisingly, it has been pushed as being a more privacy-oriented option than using your Facebook or Google account.

Fast forward to April 2020, and a security researcher from Delhi uncovered a critical Sign in with Apple vulnerability that could allow an attacker to take over an account with just an email ID. The vulnerability was deemed serious enough that Apple paid him $100,000 (£81,000) through its bug bounty program.

Considering the level of embarrassment possible for basically leaving the front door unlocked, I'd say the reward was on the light side.

I found I could request JWTs for any Email ID from Apple, and when the signature of these tokens was verified using Apple's public key, they showed as valid. This means an attacker could forge a JWT by linking any Email ID to it and gain access to the victim's account.
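The researcher's point can be sketched in miniature. In the toy below (all names are hypothetical, and a shared HMAC-SHA256 key stands in for the RS256 keypair that only Apple holds), the token service signs whatever email string the client asks for, so a relying service that checks only the signature happily accepts a token bearing the victim's address:

```python
import base64
import hashlib
import hmac
import json

SIGNING_KEY = b"toy-signing-key"  # stand-in for Apple's private key


def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def issue_token(email: str) -> str:
    """Stand-in for the flawed endpoint: it signs whatever email
    string the client asked for, without validating ownership."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps({"email": email}).encode())
    sig = hmac.new(SIGNING_KEY, f"{header}.{payload}".encode(),
                   hashlib.sha256).digest()
    return f"{header}.{payload}.{b64url(sig)}"


def verify(token: str) -> dict:
    """A relying service that checks only the signature."""
    header, payload, sig = token.split(".")
    expected = hmac.new(SIGNING_KEY, f"{header}.{payload}".encode(),
                        hashlib.sha256).digest()
    assert hmac.compare_digest(b64url(expected), sig), "bad signature"
    pad = "=" * (-len(payload) % 4)
    return json.loads(base64.urlsafe_b64decode(payload + pad))


# The attacker requests a token for the victim's address; it verifies,
# because the signature really is genuine -- that's the whole problem.
forged = issue_token("victim@example.com")
claims = verify(forged)
```

The signature only proves who minted the token, not that its claims were vetted; a correct issuer has to validate the email before signing.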


Original Submission

  • (Score: 3, Funny) by Rosco P. Coltrane on Monday June 01 2020, @02:25PM (6 children)

    by Rosco P. Coltrane (4757) on Monday June 01 2020, @02:25PM (#1001693)

    Well, color me convinced.

    • (Score: 4, Insightful) by Thexalon on Monday June 01 2020, @02:39PM (5 children)

      by Thexalon (636) on Monday June 01 2020, @02:39PM (#1001699)

      Yeah, I've heard that one before.

      The rules of data haven't changed: If information about you is passing through anyone else's computer for any reason, you should assume that sooner or later they will start collecting it, storing it, analyzing it, and possibly selling it off to somebody else, regardless of what their Terms of Service said or how vehemently their PR flaks swear up and down they would never dream of doing such a thing.

      As an example of a company that started out swearing they didn't do anything remotely like what I just described: Google.

      --
      The only thing that stops a bad guy with a compiler is a good guy with a compiler.
      • (Score: 5, Interesting) by Rosco P. Coltrane on Monday June 01 2020, @02:49PM

        by Rosco P. Coltrane (4757) on Monday June 01 2020, @02:49PM (#1001706)

        The real problem is, they can promise all they want, there's so much money to be made in the dataraping business that they won't resist the lure for very long, even if they honestly try to. Giant tech companies promising not to abuse your personal data is exactly like a heroin addict promising to stay off the needle: sure it's possible they'll keep their promise, but most likely they won't, and in any case you can never trust them.

      • (Score: 2) by Mykl on Monday June 01 2020, @11:54PM (3 children)

        by Mykl (1112) on Monday June 01 2020, @11:54PM (#1001941)

        Google have one real source of revenue - Advertising
        Facebook have one real source of revenue - Advertising
        Apple sells products and services - they don't really need advertising revenue. That makes the lure of selling users' data much less attractive. Not saying that it rules it out - just that they are in a wholly different scenario to Facebook and Google.

        • (Score: 4, Insightful) by Thexalon on Tuesday June 02 2020, @12:56AM (2 children)

          by Thexalon (636) on Tuesday June 02 2020, @12:56AM (#1001964)

          You seem to forget a basic rule of corporate America: There's no such thing as "enough" income. If they can do some horrible thing and raise their profits by 0.05% next quarter, that's exactly what they'll do, because some VP somewhere will get rewarded handsomely for those unexpectedly high numbers.

          --
          The only thing that stops a bad guy with a compiler is a good guy with a compiler.
          • (Score: 2) by TheRaven on Tuesday June 02 2020, @10:01AM (1 child)

            by TheRaven (270) on Tuesday June 02 2020, @10:01AM (#1002111) Journal
            Yes and no. Yes, a corporation will aim to maximise profit, but that doesn't mean that they will do everything that brings income. If a particular activity would bring a small amount of money in but reduce another revenue stream, they will (usually) not do it. Apple has decided that they want privacy to be a differentiating service. They could make some money from selling the data, but they think they can make a lot more money from being perceived as the company that respects your privacy. Getting caught selling data would undermine a lot of their other business.
            --
            sudo mod me up
            • (Score: 2) by Thexalon on Tuesday June 02 2020, @01:30PM

              by Thexalon (636) on Tuesday June 02 2020, @01:30PM (#1002165)

              If a particular activity would bring a small amount of money in but reduce another revenue stream, they will (usually) not do it.

              You're misunderstanding how major corporations work.

              Let's take an example: A major insurance company had an obscure division selling financial products. The guy in charge of that division wanted to make more money, so he started selling "credit default swaps" en masse, collecting big bucks while lying to his superiors about how much risk they were taking on. That guy, and many members of his staff, got big raises and bonuses for his success in raising revenue. And that all worked out great for him and the company until 2008, when it all cratered spectacularly, and the only reason the company in question even survived was a massive federal bailout.

              This kind of thing is called the "Principal-Agent Problem": Businesses will regularly do things that are bad for the business as a whole, but good for one or more individuals working for the business. So if selling user data is good for somebody, anybody, at Apple, they'll do it.

              They could make some money from selling the data, but they think they can make a lot more money from being perceived as the company that respects your privacy.

              (Emphasis mine)

              Exactly. And that makes it an advertising and public relations problem: You can sell the data, and as long as you lie convincingly about how you're totally not doing that, you get to have your cake and eat it too. And there are ways to avoid being caught doing this, like NDAs.

              That's why, when it comes to your privacy, you should ignore what they say they're doing and focus on what they can in fact do.

              --
              The only thing that stops a bad guy with a compiler is a good guy with a compiler.
  • (Score: 0) by Anonymous Coward on Monday June 01 2020, @02:46PM (10 children)

    by Anonymous Coward on Monday June 01 2020, @02:46PM (#1001703)

    Could the FBI have used this to unlock phones? I don't really use apple stuff much.

    • (Score: 3, Interesting) by JoeMerchant on Monday June 01 2020, @03:19PM (8 children)

      by JoeMerchant (3937) on Monday June 01 2020, @03:19PM (#1001718)

      Maybe not the phones, but any 3rd party app you used Apple's Sign In service with - they wouldn't even need your phone, just your AppleID email address (and no password).

      --
      🌻🌻 [google.com]
      • (Score: 2) by TheRaven on Tuesday June 02 2020, @10:08AM (7 children)

        by TheRaven (270) on Tuesday June 02 2020, @10:08AM (#1002116) Journal

        Note that you needed two bugs for this to work. Apple's bug let you create a JWT for your own user's unique ID but with someone else's email address attached. That only becomes an account takeover when combined with a second bug: the relying service treating the email address, rather than the unique ID, as the identity.

        If you use the feature that this bug is exploiting, where the service provider gets a throw-away email address, then it's hard to exploit this because the attacker won't know that randomly-generated throw-away address.
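The two-bug combination can be sketched on the relying-party side (hypothetical names below). The token's stable "sub" claim is the durable user ID; "email" is informational and, in this bug, attacker-chosen:

```python
def login_correct(claims, accounts):
    """Safe: key the account on the stable 'sub' claim."""
    return accounts.setdefault(claims["sub"], {"email": claims.get("email")})


def login_buggy(claims, accounts_by_email):
    """The second bug: keying accounts on 'email' means a forged
    email claim logs the attacker straight into the victim's account."""
    return accounts_by_email.setdefault(claims["email"], {})
```

With `login_correct`, a forged email claim only mislabels the attacker's own account; with `login_buggy`, it hands over the victim's.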

        --
        sudo mod me up
        • (Score: 2) by JoeMerchant on Tuesday June 02 2020, @12:12PM (6 children)

          by JoeMerchant (3937) on Tuesday June 02 2020, @12:12PM (#1002144)

          I don't think that Sign in with Apple was made to work with randomly-generated throw-away addresses...

          --
          🌻🌻 [google.com]
          • (Score: 2) by TheRaven on Tuesday June 02 2020, @12:39PM (5 children)

            by TheRaven (270) on Tuesday June 02 2020, @12:39PM (#1002154) Journal

            Yes it was. That is precisely its selling point: whenever you sign in with Apple, you have a choice of providing your email address or a throw-away one. Apple creates a random throw-away one for you and will relay email from there to your real address, so the owner of the service never sees your email and can't use it to track you across other sign-in-with-Apple services. You can sign in to everything with a single login, but with different identities that can't be linked by anyone (except Apple, who promises not to).

            That's also the root cause of this bug: Apple provided the one-shot email address in the query and let you choose whether to provide that one or your original by sending the email address that you wanted as a string (this also made it possible at the API layer to provide your own one-shot address for the service). Apple didn't validate this, so you could provide any string here and Apple would put it in the JWT.
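A sketch of the validation that was evidently missing (hypothetical names): because the email field travels back through a client-editable form, the server has to confirm it is one of the addresses it actually issued for this account before signing anything:

```python
def choose_sign_in_email(requested: str, real_address: str,
                         relay_address: str) -> str:
    """Accept only the user's own address or the relay address the
    server generated for this sign-in; refuse to sign anything else."""
    if requested not in (real_address, relay_address):
        raise ValueError("email not owned by this account")
    return requested
```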

            --
            sudo mod me up
            • (Score: 2) by JoeMerchant on Tuesday June 02 2020, @12:49PM (4 children)

              by JoeMerchant (3937) on Tuesday June 02 2020, @12:49PM (#1002155)

              Ah... seems like weak sauce to start with, but I see the commercial appeal. Craigslist and others have been doing similar things for a long time, but as a single server, not an OAuth-type stand-off.

              Seems like anyone wanting to anonymize their e-mail communication with a 3rd party should just make a binary choice: anonymize yes/no? If yes, they shouldn't have anything to do with selection of the ID used to carry e-mails.

              --
              🌻🌻 [google.com]
              • (Score: 2) by TheRaven on Tuesday June 02 2020, @04:20PM (3 children)

                by TheRaven (270) on Tuesday June 02 2020, @04:20PM (#1002210) Journal
                They don't, at the UI level. The UI says 'share your email address, yes / no'. This is implemented by a form submission that passes either the user's email address or a one-shot one (which was generated when the page was generated) back to Apple. Apple trusted the form submission from this page because they generated it and forgot that the client could tamper with it.
                --
                sudo mod me up
                • (Score: 2) by JoeMerchant on Tuesday June 02 2020, @04:35PM (2 children)

                  by JoeMerchant (3937) on Tuesday June 02 2020, @04:35PM (#1002216)

                  Assumption #2 in security: anything that can be tampered with, will be.

                  I just finished an impromptu review of our software update functionality. I had a request to try to use it as a backdoor to exfiltrate information to a USB stick. That wasn't in the original design spec, but it seemed like something as powerful as a software update mechanism might be able to pull it off, if the user/attacker had the proper update signing key. Unfortunately for those hoping for a quick fix to the problem du jour, it appears that the software update function, as implemented, cannot be twisted into an arbitrary file-export-to-USB mechanism unless it is actually used via its intended function: to update the device software.

                  Bottom line: in an ideal implementation, software only does what it was designed to do, nothing more. Any unintended functional scope is a potential security hole.

                  --
                  🌻🌻 [google.com]
                  • (Score: 2) by TheRaven on Tuesday June 02 2020, @05:28PM (1 child)

                    by TheRaven (270) on Tuesday June 02 2020, @05:28PM (#1002247) Journal
                    In Apple's defence, the email address is untrusted in their model. It's not the unique ID, so if you do tamper with it and the service is correctly implemented then all you can do is make it send emails containing your data to the wrong person. It's a combination of Apple's bug and other people using the wrong identifier as the client ID that causes this problem. If people used the Apple APIs correctly, the Apple bug wouldn't allow you to log in as someone else, it would just allow you to send password recovery emails to the wrong person. From Apple's perspective, the only person you can attack is yourself. Unfortunately, security usability is a real thing and the easiest way of using Apple's API was incorrectly, using the email address (which may change between logins) as the unique ID.
                    --
                    sudo mod me up
                    • (Score: 2) by JoeMerchant on Tuesday June 02 2020, @05:52PM

                      by JoeMerchant (3937) on Tuesday June 02 2020, @05:52PM (#1002261)

                      Hang on, I've heard this one before:

                      the easiest way of using Apple's API was incorrectly

                      You're holding it wrong!

                      --
                      🌻🌻 [google.com]
    • (Score: 2) by linkdude64 on Monday June 01 2020, @03:38PM

      by linkdude64 (5482) on Monday June 01 2020, @03:38PM (#1001726)

      Nah, I think they just spammed the number 9 on the login screen with their thumbs really fast and caused a buffer overflow.

  • (Score: 3, Funny) by DannyB on Monday June 01 2020, @04:00PM (2 children)

    by DannyB (5839) Subscriber Badge on Monday June 01 2020, @04:00PM (#1001738) Journal

    This is why we should only trust a single sign on system offered by Facebook.

    Then we can be confident of how both our security and privacy will be treated.

    --
    When trying to solve a problem don't ask who suffers from the problem, ask who profits from the problem.
    • (Score: 2) by JoeMerchant on Monday June 01 2020, @05:09PM (1 child)

      by JoeMerchant (3937) on Monday June 01 2020, @05:09PM (#1001786)

      A double negative makes a positive, that must explain the popularity of fecebook.

      --
      🌻🌻 [google.com]
      • (Score: 2) by DannyB on Monday June 01 2020, @05:16PM

        by DannyB (5839) Subscriber Badge on Monday June 01 2020, @05:16PM (#1001791) Journal

        I tried telling that to an English teacher once.

        --
        When trying to solve a problem don't ask who suffers from the problem, ask who profits from the problem.
  • (Score: 4, Informative) by progo on Monday June 01 2020, @05:40PM (3 children)

    by progo (6356) on Monday June 01 2020, @05:40PM (#1001802) Homepage

    Steve Gibson spent years developing SQRL -- development went slowly because he wanted NO mistakes in the specification that would violate the privacy the system promises to users. It's been spec-frozen and ready for use for a while now. The basic idea is simple, and any big company could have invented it and pushed it as a free standard for the world. But in reality no one cares.

    SQRL lets you use one login for everything, from an app on your phone or your computer or both -- you can export and import your identity via a simple offline file. When you login to a service with SQRL the service knows nothing about you -- No one in the world but you knows where else you've signed in with this identity, unless you explicitly share more information with those services such as your real name or primary email address.

    If you know about SQRL, you can look at the Apple hack for centralized identity assertion, and ask "but what's the point?"

    • (Score: 2) by progo on Monday June 01 2020, @05:49PM (2 children)

      by progo (6356) on Monday June 01 2020, @05:49PM (#1001803) Homepage
      • (Score: 2) by JoeMerchant on Monday June 01 2020, @07:02PM (1 child)

        by JoeMerchant (3937) on Monday June 01 2020, @07:02PM (#1001824)

        See also: OAuth2

        I need a similar service for a device we're developing - and it took all of an hour to not only sketch out but prototype a functional authentication server proof of concept. After I did that I checked with our "powers that be" and apparently they're using Microsoft's OpenID OAuth2.0 implementation, which may - or may not - provide what I need from an authentication server. If the establishment's server doesn't work with devices the way I want it to, I'll just have to set up a server in the middle that will provide a translation layer at a fixed IP that the OAuth server can call back to.

        This stuff is well established, and if you're not totally ADD the specs are about as simple as they come to implement and test. First question: can you get an authorization token without a valid username/password being given to the Auth server? If so: FAIL. How hard is that?
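That first question can be phrased as a minimal sketch (hypothetical names; a real server would store passwords with a salted KDF, not bare SHA-256): the token endpoint must fail closed when credentials don't check out:

```python
import hashlib
import secrets

# Toy credential store: username -> password hash.
USERS = {"alice": hashlib.sha256(b"correct horse").hexdigest()}


def issue_access_token(username, password):
    """Mint a token only for a valid username/password pair;
    anything else gets None -- no credentials, no token."""
    stored = USERS.get(username)
    if stored is None:
        return None
    if hashlib.sha256(password.encode()).hexdigest() != stored:
        return None
    return secrets.token_urlsafe(32)
```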

        --
        🌻🌻 [google.com]
        • (Score: 1, Interesting) by Anonymous Coward on Monday June 01 2020, @10:56PM

          by Anonymous Coward on Monday June 01 2020, @10:56PM (#1001923)

          SQRL is shit for a number of reasons, and isn't a replacement for anything, let alone OAuth2. There is a reason why amateur hour at GRC needed almost 7 years to get it halfway decent. Still doesn't change the broken fundamentals though. It is also a copy of a method created earlier and may actively be covered by patent, but then what do you expect from someone who infamously reinvented a shittier form of SYN cookies and has a history of over promising and under delivering?

  • (Score: 2) by zeigerpuppy on Tuesday June 02 2020, @01:35AM

    by zeigerpuppy (1298) on Tuesday June 02 2020, @01:35AM (#1001980)

    Single sign on is a complex problem by nature but there are some really good open source solutions.

    SimpleSAMLphp implements the SAML protocol and is fairly easy to set up.

    Gitlab (self hosted) also includes an OAuth backend, which is likewise easy to set up:
    https://docs.gitlab.com/ee/integration/oauth_provider.html [gitlab.com]

    Also, do yourself a favour and store your passwords in the KeePass database format.
    KeeWeb is a great reader for KeePass databases, which can be accessed via NextCloud as well.
    There's also a good Android client, KeePassDX.
    It's easy to keep everything in sync between devices by storing your passwords (and OTP keys) in a synchronised KeePass file.
