
posted by martyb on Monday March 02 2020, @07:36PM   Printer-friendly
from the cat-and-mouse dept.

Don't run your 2FA authenticator app on these smartphones:

Aaron Turner and Georgia Weidman emphasized that using authenticator apps, such as Authy or Google Authenticator, in two-factor authentication was better than using SMS-based 2FA. But, they said, an authenticator app is useless for security if the underlying mobile OS is out-of-date or the mobile device is otherwise insecure.

[...] The problem is that if an attacker or a piece of mobile malware can get into the kernel of iOS or Android, then it can do anything it wants, including presenting fake authenticator-app screens.

[...] And don't think iOS devices are safer than Android ones -- they're not. There are just as many known exploits for either one, and Weidman extracted the encryption keys from an older iPhone in a matter of seconds onstage.

The iPhone's Secure Enclave offers "some additional security, but the authenticator apps aren't using those elements," said Weidman, founder and chief technology officer of Washington-area mobile security provider Shevirah, Inc. "iOS is still good, but Android's [security-enhanced] SELinux is the bane of my existence as someone who's building exploits."

"We charge three times as much for an Android pentest than we charge for an iOS one," Turner said, referring to an exercise in which hackers are paid by a company to try to penetrate the company's security. "Fully patched Android is more difficult to go after."

[...] In short, "we need to move away from usernames and passwords," Turner said.

[...] Turner [said] "I am fundamentally opposed to using biometrics because it's non-revocable," he said, citing a famous case from Malaysia in which a man's index finger was cut off by a gang to steal the man's fingerprint-protected Mercedes. "Fingerprint readers are biometric toys."

The only form of two-factor authentication without security problems right now, Turner said, is a hardware security key such as a Yubikey or Google Titan key.

"I've got two Yubikeys on me right now," Turner said. "Hardware separation is your friend."


This discussion has been archived. No new comments can be posted.
  • (Score: 1, Insightful) by Anonymous Coward on Monday March 02 2020, @08:03PM

    But, they said, an authenticator app is useless for security if the underlying mobile OS is out-of-date or the mobile device is otherwise insecure.

    In other words: Not controlled by your corporate masters.

  • (Score: 0) by Anonymous Coward on Monday March 02 2020, @08:14PM

    But, they said, [$security_sensitive_app] is useless for security if the underlying OS is [in any way] insecure.

    Yes, we know that, and we have been saying that for years. So, Google is finally acknowledging that walled-garden, binary-only devices should not be used for security porpoises?

  • (Score: 3, Insightful) by Anonymous Coward on Monday March 02 2020, @08:32PM (2 children)

    But, they said, an authenticator app is useless for security if the underlying mobile OS is out-of-date or the mobile device is otherwise insecure.

    And here we have a wonderful example of the attitudes common amongst computer security people these days: a security system must be perfect, otherwise it is useless. Make sure you buy the latest gadget or that uselessness will rub off on you, and you don't want to be useless, do you? You must achieve perfection at all costs.

    • (Score: 2) by DannyB on Monday March 02 2020, @09:11PM (1 child)

      Whatever imperfection there is, will be found and exploited, in time.

      Then some obscure imperfection is a gaping security hole.

      --
      When trying to solve a problem don't ask who suffers from the problem, ask who profits from the problem.
      • (Score: 0) by Anonymous Coward on Monday March 02 2020, @09:38PM

        Whatever imperfection there is, will be found and exploited, in time.

        I doubt someone will dedicate resources to target code I compiled from source, while everyone else is on auto-upgrade.

  • (Score: 0) by Anonymous Coward on Monday March 02 2020, @09:17PM (3 children)

    "Fully patched Android is more difficult to go after."

    ...and there's the rub.

    • (Score: 2) by DannyB on Monday March 02 2020, @09:19PM (2 children)

When is it ever fully patched?

      Today's fully patched is tomorrow's: "why, oh why, didn't you have the latest patches!?"

      --
      When trying to solve a problem don't ask who suffers from the problem, ask who profits from the problem.
      • (Score: 0) by Anonymous Coward on Tuesday March 03 2020, @12:38AM

        The GP's point is that most Android devices don't get patches for known vulnerabilities because the manufacturer doesn't feel like patching.

        There are always going to be vulnerabilities, but there's really no excuse for the limited patching.

      • (Score: 2) by coolgopher on Tuesday March 03 2020, @05:14AM

        Today's "fully patched" is tomorrow's "but this used to work correctly".

  • (Score: 2) by DannyB on Monday March 02 2020, @09:27PM (2 children)

    1. Something you know
    2. Something you have
    3. Something you are

    Item 1 would be a password, pin. (or where the bodies are buried.)

    Item 2 would be a building key, a smart card, a keyring fob. (or a headache.)

    Item 3 would be your fingerprint, blood vessel patterns, DNA. (or stupid.)

    If you must use factor 3, something you are, it seems fingerprints might not be the best. As the movie Demolition Man suggests, eyeball retina scans might not be so great either. With DNA, maybe the bad guys poke you and draw some blood.

    Factor 1, something you know, seems useful. When coerced, you could give a different password, pin or information which provides the wanted access, but also sets off a silent alarm. In certain situations, an alternate password that locks up the entire system might be useful, if your life is worth less than what is protected, such as a nuclear weapon. But then you might not be using an Android app for this. Unless you're the president and still running Android 3.
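    The silent-alarm idea is straightforward to sketch. A minimal, hypothetical example in Python (the two passwords and the alarm convention here are made up for illustration; a real system would store per-user salts and hashes server-side):

    ```python
    import hashlib
    import hmac
    import os

    SALT = os.urandom(16)  # per-user random salt, stored alongside the hashes

    def _hash(password: str) -> bytes:
        # PBKDF2 so a stolen hash can't be trivially reversed
        return hashlib.pbkdf2_hmac("sha256", password.encode(), SALT, 100_000)

    REAL_HASH = _hash("correct horse")     # normal password (hypothetical)
    DURESS_HASH = _hash("battery staple")  # duress password (hypothetical)

    def check(password: str) -> tuple[bool, bool]:
        """Return (access_granted, silent_alarm).

        Both passwords grant access, so a coercer watching the screen
        sees nothing unusual; the duress one also trips the alarm."""
        h = _hash(password)
        if hmac.compare_digest(h, REAL_HASH):
            return True, False
        if hmac.compare_digest(h, DURESS_HASH):
            return True, True  # grant access but quietly raise the alarm
        return False, False
    ```

    The key design point is that the duress path must be indistinguishable from a normal login to anyone watching, which is why both branches return the same "granted" result.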

    --
    When trying to solve a problem don't ask who suffers from the problem, ask who profits from the problem.
    • (Score: 1, Insightful) by Anonymous Coward on Monday March 02 2020, @10:34PM

      Factor 3 is not applicable to remote authentication. For in-person, local authentication it would be possible, but as soon as you're talking to a remote server (or "cloud"), factor 3 just reduces to something you have (a recognized "factor 3" reader) and something you know (the DNA/fingerprint/smell of the user).

      And something we already know is in that worldwide database of fingerprints, courtesy of all governments taking them at passport issuance and sending them halfway across the world for "processing".

    • (Score: 2) by looorg on Tuesday March 03 2020, @02:00AM

      If option 3 is just bad because of Demolition Man (I recall someone had an eyeball on a pen or something; it was such a long time ago), then aren't fingerprints equally bad? After all, chopping off a finger is probably easier than gouging out an eyeball and keeping that around -- both probably require a certain amount of maintenance to stay viable for any extended period of time.

  • (Score: 1) by nitehawk214 on Monday March 02 2020, @10:09PM

    Internally we call our 2FA via the Google Authenticator app the STP, or "security theater project". If you can remotely access someone's phone, you can likely read their email, so they are just as fucked.

    --
    "Don't you ever miss the days when you used to be nostalgic?" -Loiosh
  • (Score: 1) by ze on Monday March 02 2020, @11:10PM

    tldr: Basically, the machine knows you're you the same way your friends/family/pets do, only as if one of them specialized in it 24/7 with super-human (+additional) senses and reliability. But please read all the nuances if you want to argue ;)

    This might sound crazy to a lot of people, but what I think I'd like for authentication is just for the machine to know me, and the condition I'm in, better than any human could, on the basis of dozens or more factors correlated from many sensor inputs and sufficiently accurate recognition systems.
    For example, the sound of a single heart beat (which, I find, an amplified and sensitive mic can pick up from 20 feet away) probably has a unique acoustic resonance specific to each body... same for the sound of a single breath... add in actual heart rate and breathing patterns, recognition of faces, height, weight, build, and other specific physical characteristics, mannerisms, voice, odors, etc etc.
    Human bodies are full of unique signatures, and if you correlate enough of them and track changes over time, you'll get a unique and reliable recognition profile for an individual and their mental/physical states.
    This would require tons of training, and on devices you trust to keep this info secure (no cloud or phoning home)... the device that knows you may even stay on your person, maybe with daily data backups stored in a safe. Basically acting as a hardware security key, but one that won't authenticate for anyone but you.
    It should train its profile on you on an ongoing and continuous basis so that it can tell not only your baseline on each parameter, but the typical variances, as well as how they may shift over time, and their recent history...
    This means it doesn't just recognize your typical state, but your full range, and keeps up to date on your present condition. Even if someone could capture all this data and play it back to the system convincingly enough, it would be able to tell if the data was too stale. It would still recognize you after you gained weight, because it watched you do it, and would see a big red flag if you inexplicably lost it all in a day.
    It would alert to any inconsistencies, like if it detected an injury one moment that went away the next. It could also be on alert if it loses and regains awareness of you, and treat the re-established data stream with extra scrutiny for inconsistencies until it's seen enough to regain confidence (or that happening might even be an alert condition in itself). It could also tell if there was a doppelganger trying to fool it while it could still see you there too.
    Basically, it's easy to fake any single metric, but the whole messy collection would be sensitive to anything that's wrong about it; not in the flaky way of a fingerprint reader that fails whenever it can't get a perfect match, but in a robust, "I can't get clear and complete readings on your fingerprints right now, but it's ok because what I can see of them is useful and correct, and a hundred other factors are too, and none of them or their relationships look alarmingly unusual for you" (unless they actually do) kind of way.
    Note I've also mentioned mental state... it can see your expression, heart rate, breathing, pupil diameter, and so many other subtle and correlatable factors all the time, and could learn to distinguish your states more reliably than your parents or partner can.

    "I'm sorry, Dave, but it looks like you're being coerced. I can't open the door until either you or your mysterious companions step out of the secure entryway so we can chat about it." for one example of how it might go...

    Note also that I am against biometrics that lack this level of sophistication. If it can't tell that you're really you, and that you're alive and well (as much as can reasonably be expected), and not being coerced (and distinguish that from just being irritated, excited, exercising recently, etc), then it's just a liability/backdoor.
    Also of potential relevance is that the actual private crypto keys your polymetric device uses to sign auth requests with can be revoked if needed. But your polymetric profile would never need to be revoked, 'cause no one can sufficiently steal it from you.

    The main weakness I can anticipate is if all this data can be captured by your device for a profile, other devices may attempt to capture just as much to build their own profile with which to create a convincing simulation of you. Its constant awareness of you could be one safeguard, as could mixing in metrics that can't be read without physical contact or cooperation, as well as the overall tracking of your condition and its trajectory, and the system seeking a deep level of self-consistency that less complete/current profiles couldn't live up to. Also, even with the data, just simulating a consistent person to such detail seems like a much harder and higher-resource problem than merely recognizing what your body does naturally. And maybe we could continue finding new non-derivative factors to add to the profile over time, making it even harder on anyone trying to build a fake one.
    But it's also possible this scheme wouldn't really work as well as it would need to. *shrug*

  • (Score: 4, Informative) by Beryllium Sphere (r) on Tuesday March 03 2020, @03:27AM (1 child)

    The photo on your driver's license is a biometric. You don't need to revoke it (plastic surgery?) if someone takes another picture of you. You don't need to revoke it if someone steals your driver's license.

    Security for a biometric comes, not from secrecy, but from having a trustworthy authentication channel. If you print out a picture of someone else's driver's license photo and hold it up in front of your face to the officer who pulls you over, the credential theft will not succeed.

    The authentication channel has to be smart enough to "accept no substitutes". If it can't tell the difference between a finger and a Gummi Bear, then that's where the problem is, not the fact that you can't grow replacement fingers. If it can tell the difference between a living face and a professionally produced mannequin, as I've heard Apple's Face ID can do, then the threat model doesn't include anything that revocation could fix.

    • (Score: 2) by coolgopher on Tuesday March 03 2020, @05:17AM

      I'd still rather have a password/phrase that I can divulge than have someone take a finger because of an accept-no-substitute biometric security thingy.

  • (Score: 2) by stormwyrm on Tuesday March 03 2020, @03:48AM

    As I've recently noted though [soylentnews.org], FIDO2 hardware key support is actually not that common. Of the online services I regularly use, only Google, Namecheap, and Github actually have any support for FIDO2. Amazon and PayPal, despite being very high-value accounts, don't have support for FIDO2. They only support TOTP two-factor authentication, with SMS as the only backup option. No thanks, I'm not trusting ultimate security of my accounts to my mobile carrier. Maybe I'll save the authenticator secret somewhere safe (as the AC poster in my journal suggests) to make recovery possible.
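    For what it's worth, saving the TOTP secret is a sound recovery strategy precisely because TOTP is nothing more than HMAC-SHA1 over a 30-second time counter (RFC 6238): anyone holding the base32 secret can regenerate codes on any device. A minimal stdlib-only sketch:

    ```python
    import base64
    import hashlib
    import hmac
    import struct
    import time

    def totp(secret_b32: str, interval: int = 30, digits: int = 6, now=None) -> str:
        """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter."""
        pad = "=" * (-len(secret_b32) % 8)  # restore any stripped base32 padding
        key = base64.b32decode(secret_b32.upper() + pad)
        counter = int((time.time() if now is None else now) // interval)
        mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = mac[-1] & 0x0F  # dynamic truncation, per RFC 4226
        code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
        return str(code).zfill(digits)

    # RFC 6238 test secret ("12345678901234567890" in base32), at T=59:
    print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", now=59))  # → 287082
    ```

    Which is also why the saved secret needs the same protection as a password: it is the entire credential.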

    --
    Numquam ponenda est pluralitas sine necessitate.