
posted by Fnord666 on Monday June 01 2020, @01:04PM   Printer-friendly
from the OAuth2-isn't-that-hard dept.

What if I said your email ID is all I need to take over your account on your favorite website or app? Sounds scary, right? This is what a bug in Sign in with Apple allowed me to do.

When Apple announced Sign in with Apple at the June 2019 Worldwide Developers Conference, it called it a "more private way to simply and quickly sign into apps and websites." The idea was, and still is, a good one: replace social logins that can be used to collect personal data with a secure authentication system backed by Apple's promise not to profile users or their app activity.

One of the plus points that got a lot of attention at the time was the ability for a user to sign up for third-party apps and services without needing to disclose their Apple ID email address. Unsurprisingly, it has been pitched as a more privacy-oriented option than using your Facebook or Google account.

Fast forward to April 2020, and a security researcher from Delhi uncovered a critical Sign in with Apple vulnerability that could allow an attacker to take over an account with nothing more than an email ID. The flaw was deemed serious enough that Apple paid him $100,000 (£81,000) through its bug bounty program by way of a reward.

Considering the level of embarrassment possible for basically leaving the front door unlocked, I'd say the reward was on the light side.

I found I could request JWTs for any email ID from Apple, and when the signature of these tokens was verified using Apple's public key, they showed as valid. This means an attacker could forge a JWT by linking any email ID to it and gain access to the victim's account.
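
For context, the server-side check that made these forged tokens dangerous is ordinary JWT verification. Below is a minimal sketch, in Python with the PyJWT library, of roughly what a relying party does with the identity token it receives. The JWKS and issuer URLs are Apple's published endpoints, while the client ID and function names are hypothetical; this is not code from the researcher's write-up. The point is that every check here passed for the forged tokens too, because Apple's own servers had signed them: the flaw was in Apple's token issuance, so a third-party service had no way to tell a forged token from a real one.

    import jwt  # PyJWT >= 2.0, which provides PyJWKClient

    APPLE_JWKS_URL = "https://appleid.apple.com/auth/keys"  # Apple's public keys
    APPLE_ISSUER = "https://appleid.apple.com"
    CLIENT_ID = "com.example.myapp"  # hypothetical app identifier

    def verify_apple_identity_token(identity_token: str) -> dict:
        """Verify a Sign in with Apple identity token and return its claims."""
        # Fetch the Apple public key whose 'kid' matches the token header.
        signing_key = jwt.PyJWKClient(APPLE_JWKS_URL).get_signing_key_from_jwt(
            identity_token
        )
        # Validate signature, issuer, audience, and expiry in one call.
        # All of these checks succeeded for the forged tokens as well,
        # because they carried a genuine Apple signature.
        return jwt.decode(
            identity_token,
            signing_key.key,
            algorithms=["RS256"],
            audience=CLIENT_ID,
            issuer=APPLE_ISSUER,
        )

    # Typical usage: trust the verified email and log the user in. With the
    # bug, that email could have belonged to anyone.
    # claims = verify_apple_identity_token(token_from_login_request)
    # user = find_or_create_account(claims["email"])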


Original Submission

 
  • (Score: 4, Insightful) by Thexalon (636) on Monday June 01 2020, @02:39PM (#1001699) (5 children)

    Yeah, I've heard that one before.

    The rules of data haven't changed: if information about you is passing through anyone else's computer for any reason, you should assume that sooner or later they will start collecting it, storing it, analyzing it, and possibly selling it off to somebody else, regardless of what their Terms of Service say or how vehemently their PR flaks swear up and down that they would never dream of doing such a thing.

    As an example of a company that started out swearing they didn't do anything remotely like what I just described: Google.

    --
    The only thing that stops a bad guy with a compiler is a good guy with a compiler.
  • (Score: 5, Interesting) by Rosco P. Coltrane (4757) on Monday June 01 2020, @02:49PM (#1001706)

    The real problem is that they can promise all they want; there's so much money to be made in the dataraping business that they won't resist the lure for very long, even if they honestly try to. Giant tech companies promising not to abuse your personal data is exactly like a heroin addict promising to stay off the needle: sure, it's possible they'll keep their promise, but most likely they won't, and in any case you can never trust them.

  • (Score: 2) by Mykl (1112) on Monday June 01 2020, @11:54PM (#1001941) (3 children)

    Google has one real source of revenue - advertising.
    Facebook has one real source of revenue - advertising.
    Apple sells products and services; it doesn't really need advertising revenue. That makes the lure of selling users' data much less attractive. I'm not saying that rules it out - just that Apple is in a wholly different scenario to Facebook and Google.

    • (Score: 4, Insightful) by Thexalon (636) on Tuesday June 02 2020, @12:56AM (#1001964) (2 children)

      You seem to forget a basic rule of corporate America: There's no such thing as "enough" income. If they can do some horrible thing and raise their profits by 0.05% next quarter, that's exactly what they'll do, because some VP somewhere will get rewarded handsomely for those unexpectedly high numbers.

      --
      The only thing that stops a bad guy with a compiler is a good guy with a compiler.
      • (Score: 2) by TheRaven (270) on Tuesday June 02 2020, @10:01AM (#1002111) Journal (1 child)
        Yes and no. Yes, a corporation will aim to maximise profit, but that doesn't mean that they will do everything that brings income. If a particular activity would bring a small amount of money in but reduce another revenue stream, they will (usually) not do it. Apple has decided that they want privacy to be a differentiating service. They could make some money from selling the data, but they think they can make a lot more money from being perceived as the company that respects your privacy. Getting caught selling data would undermine a lot of their other business.
        --
        sudo mod me up
        • (Score: 2) by Thexalon (636) on Tuesday June 02 2020, @01:30PM (#1002165)

          "If a particular activity would bring a small amount of money in but reduce another revenue stream, they will (usually) not do it."

          You're misunderstanding how major corporations work.

          Let's take an example: a major insurance company had an obscure division selling financial products. The guy in charge of that division wanted to make more money, so he started selling "credit default swaps" en masse and collecting big bucks while lying to his superiors about how much risk they were taking on. He and many members of his staff got big raises and bonuses for their success in raising revenue. And that all worked out great for him and the company until 2008, when it all cratered spectacularly, and the only reason the company in question even survived was a massive federal bailout.

          This kind of thing is called the "Principal-Agent Problem": Businesses will regularly do things that are bad for the business as a whole, but good for one or more individuals working for the business. So if selling user data is good for somebody, anybody, at Apple, they'll do it.

          "They could make some money from selling the data, but they think they can make a lot more money from being *perceived* as the company that respects your privacy."

          (Emphasis mine)

          Exactly. And that makes it an advertising and public relations problem: You can sell the data, and as long as you lie convincingly about how you're totally not doing that, you get to have your cake and eat it too. And there are ways to avoid being caught doing this, like NDAs.

          That's why, when it comes to your privacy, you should ignore what they say they're doing and focus on what they can in fact do.

          --
          The only thing that stops a bad guy with a compiler is a good guy with a compiler.