
posted by girlwhowaspluggedout on Sunday March 02 2014, @12:01AM   Printer-friendly
from the one-bad-apple-spoils-the-whole-bunch dept.

Papas Fritas writes:

"Last October, Bruce Schneier speculated that the three characteristics of a good backdoor are a low chance of discovery, high deniability if discovered, and minimal conspiracy to implement. He now says that the critical iOS and OSX vulnerability that Apple patched last week meets these criteria, and could be an example of a deliberate change by a bad actor:

Look at the code. What caused the vulnerability is a single line of code: a second "goto fail;" statement. Since that statement isn't a conditional, it causes the whole procedure to terminate ... Was this done on purpose? I have no idea. But if I wanted to do something like this on purpose, this is exactly how I would do it.

He later added that 'if the Apple auditing system is any good, they will be able to trace this errant goto line to the specific login that made the change.'

Steve Bellovin, professor of Computer Science at Columbia University and Chief Technologist of the Federal Trade Commission, has another take on the vulnerability: 'It may have been an accident; if it was enemy action, it was fairly clumsy.'"

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 5, Interesting) by AudioGuy on Sunday March 02 2014, @12:41AM

    by AudioGuy (24) on Sunday March 02 2014, @12:41AM (#9297) Journal

    if ((err = SSLHashSHA1.update(&hashCtx, &serverRandom)) != 0)
        goto fail;
    if ((err = SSLHashSHA1.update(&hashCtx, &signedParams)) != 0)
        goto fail;
        goto fail;
    if ((err = SSLHashSHA1.final(&hashCtx, &hashOut)) != 0)
        goto fail;

    I wish *my* C bugs were so obvious as this one. ;-)

    This means that not only was this missed, but that the function, a vital security function, was never even tested.

    The comments on Bruce's blog are worth reading as well. https://www.schneier.com/blog/archives/2014/02/was_the_ios_ssl.html [schneier.com]

    It is very hard to see how this one could have been accidental.

    Starting Score:    1  point
    Moderation   +3  
       Insightful=1, Interesting=2, Total=3
    Extra 'Interesting' Modifier   0  
    Karma-Bonus Modifier   +1  

    Total Score:   5  
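The control-flow trap described above boils down to a few lines. The sketch below uses hypothetical stand-in functions (not Apple's actual API) to show why the bug is so quiet: the duplicated goto is unconditional, the verification step after it becomes unreachable, and yet the function still returns 0 for success. A compiler diagnostic such as clang's -Wunreachable-code (or, in later GCC releases, -Wmisleading-indentation) would likely have flagged it.

```c
#include <assert.h>

/* stand-in for a hash-update call that succeeds */
static int step_ok(void) { return 0; }

/* Mirrors the buggy control flow. did_verify reports whether the
 * verification step actually ran. */
static int verify_buggy(int *did_verify)
{
    int err = 0;
    *did_verify = 0;

    if ((err = step_ok()) != 0)
        goto fail;
        goto fail;      /* the extra line: unconditional, always taken */

    *did_verify = 1;    /* unreachable: the real check is skipped */

fail:
    return err;         /* err is still 0, so the caller sees success */
}
```

Note that nothing here crashes or misbehaves visibly; the function reports success on every well-formed input, which is exactly why ordinary positive tests pass.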
  • (Score: 5, Interesting) by FatPhil on Sunday March 02 2014, @12:50AM

    by FatPhil (863) <reversethis-{if.fdsa} {ta} {tnelyos-cp}> on Sunday March 02 2014, @12:50AM (#9305) Homepage
    > It is very hard to see how this one could have been accidental.

    Let's pour some kerosene on that...

    http://daringfireball.net/2014/02/apple_prism
    """
    Jeffrey Grossman, on Twitter:
    > I have confirmed that the SSL vulnerability was introduced in iOS 6.0. It is not present in 5.1.1 and is in 6.0.
    iOS 6.0 shipped on 24 September 2012.

    According to slide 6 in the leaked PowerPoint deck on NSA's PRISM program, Apple was "added" in October 2012.

    These three facts prove nothing; it's purely circumstantial. But the shoe fits.
    """
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
  • (Score: 1) by forsythe on Sunday March 02 2014, @02:30AM

    by forsythe (831) on Sunday March 02 2014, @02:30AM (#9354)

    was never even tested

    If anything speaks to it being "enemy action", it's this. Anybody could make this error accidentally. Perhaps the chances are astronomically low of somebody making this error and not noticing it, but perhaps the codebase is astronomically large. I can even see something like this being ignored in a diff-only peer review, though that's a bigger stretch.

    Somehow engineering circumstances to have this not tested, however, pretty much has to be intentional.

    • (Score: 2) by frojack on Sunday March 02 2014, @04:22AM

      by frojack (1554) on Sunday March 02 2014, @04:22AM (#9394) Journal

      But, as I understand the bug, that line of code was meant to detect a bad cert, or a man-in-the-middle attack.

      Even in wide-scale testing, you are not likely to encounter that in the real world. And in this case, it would just allow the site to load as normal. You'd be owned, but none the wiser. The code would pass the test.

      Adam Langley on his blog [imperialviolet.org] coded up a cute, harmless little demonstrator for this bug.
      This is the direct URL https://www.imperialviolet.org:1266/ [imperialviolet.org]

      Chrome just says No Way.
      Firefox spits confusing jargon that translates to No Way.
      Even crusty old Kong catches this.

      So unless you have a deliberately created bad web site to test with, you would never see this bug. It seems accidental that it was found at all.

      --
      No, you are mistaken. I've always had this sig.
      • (Score: 4, Insightful) by forsythe on Sunday March 02 2014, @05:07AM

        by forsythe (831) on Sunday March 02 2014, @05:07AM (#9406)

        Sure, in the real world this bug would be hard to detect. But I find it hard to believe that anyone at Apple would approve a function for detecting bad certs that didn't even have a test record including data that [should have] failed sslRawVerify (which, as I understand it, is the key step that the goto skips). That's the sort of thing big, professional software companies are supposed to do, isn't it? That leaves a few possibilities: either the test record was doctored, the test cases were carefully constructed not to expose this bug, or there simply weren't any tests intended to cover this case.

        Hanlon's Razor says the third case is most likely, but I'm not so sure I should trust it in this case.
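The negative test whose absence is being discussed here is tiny to write. The sketch below assumes a hypothetical check_signature() standing in for the sslRawVerify path (it is not Apple's actual API); the point is only that a verifier's test suite must include at least one input that is required to fail, or the verifier can skip all its work and still pass.

```c
#include <string.h>
#include <stddef.h>

/* Hypothetical verifier: for illustration, "valid" means the signature
 * matches the expected digest byte-for-byte. Returns 0 on success and a
 * nonzero errSSLCrypto-style code on mismatch. */
static int check_signature(const unsigned char *expected,
                           const unsigned char *sig, size_t len)
{
    return memcmp(expected, sig, len) == 0 ? 0 : -9847;
}
```

A test record for this function would assert both directions: a well-formed signature is accepted, and a deliberately corrupted one is rejected. The buggy goto-fail path would sail through the first assertion and be caught by the second.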

  • (Score: 4, Interesting) by chr1sb on Sunday March 02 2014, @04:36AM

    by chr1sb (2778) on Sunday March 02 2014, @04:36AM (#9401)
    The old maxim "Never attribute to malice what can be attributed to incompetence" applies here too. No human is required to write the code in that specific way for the code to exist in that form. This is the kind of defect that can be introduced by e.g. a merge issue between branches. When code has been modified in both branches, the merge tool will do its best but can easily break the code in this way.

    This is another reason for good unit tests and code reviews, as pointed out by others. Now, if the unit tests were modified to *no longer* detect that circumstance, then I would be suspicious.

    To reduce the likelihood of these issues arising, the code can be structured in a different way, with no need for gotos, more protection against such merge issues and with structural flaws being more obvious:

    static OSStatus
    SSLVerifySignedServerKeyExchange(SSLContext *ctx, bool isRsa, SSLBuffer signedParams,
                                     uint8_t *signature, UInt16 signatureLen)
    {
        OSStatus err;

        // some code

        if (   (0 != (err = SSLHashSHA1.update(&hashCtx, &serverRandom)))
            || (0 != (err = SSLHashSHA1.update(&hashCtx, &signedParams)))
            || (0 != (err = SSLHashSHA1.final(&hashCtx, &hashOut))))
        {
            SSLFreeBuffer(&signedHashes);
            SSLFreeBuffer(&hashCtx);
            return err;
        }

        // more code
    }

  • (Score: 2) by maxwell demon on Sunday March 02 2014, @01:15PM

    by maxwell demon (1608) on Sunday March 02 2014, @01:15PM (#9579) Journal

    Dijkstra obviously was right.

    --
    The Tao of math: The numbers you can count are not the real numbers.