

posted by Fnord666 on Monday June 01 2020, @01:04PM   Printer-friendly
from the OAuth2-isn't-that-hard dept.

What if I said your email ID is all I need to take over your account on your favorite website or app? Sounds scary, right? This is what a bug in Sign in with Apple allowed me to do.

When Apple announced Sign in with Apple at the June 2019 worldwide developers conference, it called it a "more private way to simply and quickly sign into apps and websites." The idea was, and still is, a good one: replace social logins that can be used to collect personal data with a secure authentication system backed by Apple's promise not to profile users or their app activity.

One of the plus points that got a lot of attention at the time was the ability for a user to sign up with third-party apps and services without needing to disclose their Apple ID email address. Unsurprisingly, it has been pushed as being a more privacy-oriented option than using your Facebook or Google account.

Fast forward to April 2020, and a security researcher from Delhi uncovered a critical Sign in with Apple vulnerability that could allow an attacker to take over an account with just an email ID. The vulnerability was deemed serious enough that Apple paid him $100,000 (£81,000) through its bug bounty program by way of a reward.

Considering the level of embarrassment possible for basically leaving the front door unlocked, I'd say the reward was on the light side.

I found I could request JWTs for any email ID from Apple, and when the signature of these tokens was verified using Apple's public key, they showed as valid. This means an attacker could forge a JWT by linking any email ID to it, gaining access to the victim's account.
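To illustrate the class of bug described above: the token service signed claims for whatever email the caller supplied, so a valid signature proved nothing about who actually requested the token. The sketch below is hypothetical and simplified — Apple's real tokens use RS256 with a published public key, while this stand-in uses stdlib HMAC (HS256) to keep the example self-contained; the key, claim names, and functions are all illustrative, not Apple's actual API.

```python
import base64
import hashlib
import hmac
import json

# Hypothetical stand-in for the identity provider's signing key.
# (Apple actually uses RS256 with an asymmetric key pair.)
SECRET = b"stand-in-signing-key"

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, per the JWT wire format."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def issue_token(claims: dict) -> str:
    """Model of the flawed endpoint: signs whatever claims the caller asks for."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    sig = b64url(hmac.new(SECRET, f"{header}.{payload}".encode(),
                          hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

def verify_signature(token: str) -> dict:
    """A relying party that checks ONLY the signature, then trusts the email claim."""
    header, payload, sig = token.split(".")
    expected = b64url(hmac.new(SECRET, f"{header}.{payload}".encode(),
                               hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    padded = payload + "=" * (-len(payload) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

# The attack: request a token bound to the victim's email.
# The signature is genuine, so signature verification alone cannot catch it.
forged = issue_token({"email": "victim@example.com"})
claims = verify_signature(forged)
print(claims["email"])  # the relying party now trusts this identity
```

The point is that the cryptography was never broken: the fix was in the issuing step (binding the signed email claim to the authenticated session that requested it), not in the verification step.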


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2) by TheRaven on Tuesday June 02 2020, @10:01AM (1 child)

    by TheRaven (270) on Tuesday June 02 2020, @10:01AM (#1002111) Journal
    Yes and no. Yes, a corporation will aim to maximise profit, but that doesn't mean that they will do everything that brings income. If a particular activity would bring a small amount of money in but reduce another revenue stream, they will (usually) not do it. Apple has decided that they want privacy to be a differentiating service. They could make some money from selling the data, but they think they can make a lot more money from being perceived as the company that respects your privacy. Getting caught selling data would undermine a lot of their other business.
    --
    sudo mod me up
  • (Score: 2) by Thexalon on Tuesday June 02 2020, @01:30PM

    by Thexalon (636) on Tuesday June 02 2020, @01:30PM (#1002165)

    If a particular activity would bring a small amount of money in but reduce another revenue stream, they will (usually) not do it.

    You're misunderstanding how major corporations work.

    Let's take an example: A major insurance company had an obscure division selling financial products. The guy in charge of that division wanted to make more money, so he started selling "credit default swaps" en masse and collecting big bucks while lying to his superiors about how much risk they were taking on. That guy, and many members of his staff, got big raises and bonuses for his success in raising revenue. And that all worked out great for him and the company until 2008, when all of a sudden it cratered spectacularly, and the only reason the company in question even survived was from a massive federal bailout.

    This kind of thing is called the "Principal-Agent Problem": Businesses will regularly do things that are bad for the business as a whole, but good for one or more individuals working for the business. So if selling user data is good for somebody, anybody, at Apple, they'll do it.

    They could make some money from selling the data, but they think they can make a lot more money from being perceived as the company that respects your privacy.

    (Emphasis mine)

    Exactly. And that makes it an advertising and public relations problem: You can sell the data, and as long as you lie convincingly about how you're totally not doing that, you get to have your cake and eat it too. And there are ways to avoid being caught doing this, like NDAs.

    That's why, when it comes to your privacy, you should ignore what they say they're doing and focus on what they can in fact do.

    --
    The only thing that stops a bad guy with a compiler is a good guy with a compiler.