posted by girlwhowaspluggedout on Sunday March 02 2014, @12:01AM   Printer-friendly
from the one-bad-apple-spoils-the-whole-bunch dept.

Papas Fritas writes:

"Last October, Bruce Schneier speculated that the three characteristics of a good backdoor are a low chance of discovery, high deniability if discovered, and minimal conspiracy to implement. He now says that the critical iOS and OSX vulnerability that Apple patched last week meets these criteria, and could be an example of a deliberate change by a bad actor:

Look at the code. What caused the vulnerability is a single line of code: a second "goto fail;" statement. Since that statement isn't a conditional, it causes the whole procedure to terminate ... Was this done on purpose? I have no idea. But if I wanted to do something like this on purpose, this is exactly how I would do it.

He later added that 'if the Apple auditing system is any good, they will be able to trace this errant goto line to the specific login that made the change.'

Steve Bellovin, professor of Computer Science at Columbia University and Chief Technologist of the Federal Trade Commission, has another take on the vulnerability: 'It may have been an accident; if it was enemy action, it was fairly clumsy.'"

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Insightful) by jt on Sunday March 02 2014, @12:07AM

    by jt (2890) on Sunday March 02 2014, @12:07AM (#9283)

    Tracing it back to the single login associated with the change doesn't necessarily identify who was really responsible. If I were a three letter agency and wanted to introduce a weakness, surely I would 'borrow' the credentials of someone who either did not know, or who could be bribed/coerced into doing this?

    It's bad news either way. Either it's been introduced deliberately, by goodness knows who, or it's an honest error which somehow managed to evade the review process. Come on, it's a duplicate line, and it stands out immediately when you skim through the code.

    • (Score: 4, Interesting) by frojack on Sunday March 02 2014, @12:45AM

      by frojack (1554) on Sunday March 02 2014, @12:45AM (#9301) Journal

      But a three letter agency might have been able to disguise it a little better, don't you think? (Unless they were going for deniability rather than long-term endurance).

      If every other browser on every other system barfs on a bad cert, you have to ask why a three letter agency would want to compromise only APPLE products.

      It may stand out immediately when you skim that tiny section of code, but when you skim a mountain of code you could easily miss this.

      You really need to see the change patch that was put in. If that entire section was put in as one change, I'd suspect clear intent.

      On the other hand, if the second IF statement went in to replace one that was already there, it would be pretty easy to be off by one line number on the patch, leaving the second goto as a remnant.

      --
      No, you are mistaken. I've always had this sig.
      • (Score: 5, Insightful) by WildWombat on Sunday March 02 2014, @02:15AM

        by WildWombat (1428) on Sunday March 02 2014, @02:15AM (#9347)

        I don't have any clue whether that line was put there purposely, but according to Jacob Appelbaum [youtube.com], in a talk he gave at 30c3, the NSA has been able to own any Apple machine they want for a long time now. I think it is probable that even if the NSA didn't plant that line, they were aware of it.

        --"But a three letter agency might have been able to disguise it a little better, don't you think? (Unless they were going for deniability rather than long-term endurance)."

        Maybe, or maybe it was but the most obvious of many backdoors they have. It's impossible to know, since instead of protecting the American public like they're supposed to and fixing these types of flaws, they hoard them in order to use them, leaving all of us vulnerable.

        Cheers,
        -WW

      • (Score: 4, Insightful) by mojo chan on Sunday March 02 2014, @11:25AM

        by mojo chan (266) on Sunday March 02 2014, @11:25AM (#9522)

        It looks like a merging error, where someone wanted to merge their new code with someone else's changes and bungled it. The NSA/GCHQ must love bugs like this: highly deniable but also apparently easy to miss for years. As for why it only targets Apple products, it's probably just that they had the opportunity and took it.

        --
        const int one = 65536; (Silvermoon, Texture.cs)
    • (Score: 5, Interesting) by MichaelDavidCrawford on Sunday March 02 2014, @04:02AM

      by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Sunday March 02 2014, @04:02AM (#9382) Homepage Journal

      In 1995 and 1996, I was a "Debug Meister" on "The Team Formerly Known As The Blue Meanies" - quality zealots who all used to wear bright blue t-shirts during the development of Mac OS System 7 in the late eighties. We were properly known as "Traditional Operating System Integration". Usually we called ourselves "TradOS".

      There was also a "Modern OS Integration" team. I was offered an internal transfer to it, but I had the vague sense that Copland would never ship. :-/

      Every single one of us had commit privileges to just about all of Apple's source code. I myself kept around the source to the MacsBug machine debugger. It could disassemble both 68k and PowerPC binaries, and had this really cool ability to disassemble backwards, that is, to show a few instructions before a breakpoint. MacsBug did not always get it right, but it usually did.

      From time to time I'd roll myself a custom feature into MacsBug, to use while isolating some random bug in a new build of either 7.5.2 or 7.5.3. 7.5.2 supported the first PCI bus PowerPC macs - the 7500, 8500 and 9500. 7.5.3 was for the "Speed Bumps", the 7600, 8600 and 9600.

      It would not have been hard at all to have observed my colleague going to lunch, then to have stepped into his office then to have made that one-line SSL hack.

      There is quite a famous story about how Greg Robbins, one of the authors of the PowerPC Graphing Calculator, managed to keep working for a year at Apple, despite no longer being a contract programmer there, and no longer getting paid.

      It was only when some Apple security guard managed to figure out that Greg didn't have a card key that he got caught out.

      "You're not on the payroll?" asked the guard incredulously.

      "No, but there's a lot of work left to do to support the PCI macs."

      Greg was one of my fellow debug meisters.

      It turns out that the QA lab where he worked - as a contract programmer, he did not have a private office - was right next to my own office.

      I'm not dead certain, but I have reason to believe that for much of the year that he provided free labor to The Cupertino Fruit Company, quite often I was the one who used my card key to let him into the lab so he could start his workday.

      --
      Yes I Have No Bananas. [gofundme.com]
  • (Score: 3, Insightful) by FatPhil on Sunday March 02 2014, @12:37AM

    by FatPhil (863) <reversethis-{if.fdsa} {ta} {tnelyos-cp}> on Sunday March 02 2014, @12:37AM (#9294) Homepage
    From the later link: "On the gripping hand, the error is noticeable by anyone poking at the file, and it's one of the pieces of source code that Apple publishes, which means it's not a great choice for covert action by the NSA or Unit 61398."

    Only an idiot would attempt to hide an exploit in open source software.
    So there's no need to even check to see if there are such back doors.
    Which makes it the best place to hide!

    However, jesting aside, how this didn't trigger a compiler diagnostic, I don't know. Or a Coverity/Purify warning about "unreachable code" or similar. This is dreadfully shoddy work from Apple, or whatever bottom-of-the-barrel third-rate subcontractors were responsible for the code.
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
    • (Score: 4, Interesting) by frojack on Sunday March 02 2014, @01:13AM

      by frojack (1554) on Sunday March 02 2014, @01:13AM (#9314) Journal

      I thought Bellovin's blog was a better analysis.

      I know all the compilers I use would have flagged the third IF as unreachable code.

      However, most compilers will allow this with merely a warning, because sometimes programmers branch around stuff for testing purposes. (This is why most competent programming shops don't allow any warnings in production code.)

      Maybe Apple has lower standards, or maybe their compiler doesn't flag that kind of error.

      --
      No, you are mistaken. I've always had this sig.
      • (Score: 5, Informative) by mojo chan on Sunday March 02 2014, @11:31AM

        by mojo chan (266) on Sunday March 02 2014, @11:31AM (#9524)

        GCC doesn't flag it, and neither do quite a few other compilers. The problem is that code being unreachable is only determined after optimization, and so different optimization levels can produce different results. That made the output of the check unstable, so it was disabled.

        For example you might write some code that tests a variable and then executes one or two possible code paths (an IF statement). Without optimization the compiler will generate code to test the variable and branch. With extensive optimization the compiler might realize that there is no possible code path from main() through to this function that would cause the test to result in a true result and so optimize away the true result code path completely, removing the test and simply executing the false path.

        Often it is desirable to disable or at least reduce the optimization level to minimum to aid debugging, because the assembler code produced with maximum optimization can be pretty much unreadable and break things like watching variables in the debugger because they get optimized down to multiple registers. It's really not an easy problem to solve.

        --
        const int one = 65536; (Silvermoon, Texture.cs)
        • (Score: 2) by frojack on Sunday March 02 2014, @08:46PM

          by frojack (1554) on Sunday March 02 2014, @08:46PM (#9731) Journal

          The problem is that code being unreachable is only determined after optimization,

          Wait, What?

          An arbitrary GOTO, not subordinate to any conditional, means that ALL subsequent lines up to the next label are unreachable. The second consecutive goto was totally unconditional.

          You don't need optimization passes to detect that; it is basic compiler theory.

          Hell, in the past I have used code editors built into IDEs that will detect that.

          --
          No, you are mistaken. I've always had this sig.
          • (Score: 2) by mojo chan on Sunday March 02 2014, @09:55PM

            by mojo chan (266) on Sunday March 02 2014, @09:55PM (#9760)

            Sure, I was just stating why they disabled the feature. It wasn't reliable; the output changed based on the optimization level.

            --
            const int one = 65536; (Silvermoon, Texture.cs)
    • (Score: 1) by MichaelDavidCrawford on Sunday March 02 2014, @04:05AM

      by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Sunday March 02 2014, @04:05AM (#9385) Homepage Journal

      I can no longer get into just about the most important email account I've ever had.

      So I'm going to type all of my passwords into a text file, put that text file on a TrueCrypt volume, and then... ... WAIT FOR IT! ...

      spray-paint that TrueCrypt volume's password on the side of a wall in one of Vancouver's rougher neighborhoods.

      --
      Yes I Have No Bananas. [gofundme.com]
  • (Score: 5, Interesting) by AudioGuy on Sunday March 02 2014, @12:41AM

    by AudioGuy (24) on Sunday March 02 2014, @12:41AM (#9297) Journal

    if ((err = SSLHashSHA1.update(&hashCtx, &serverRandom)) != 0)
        goto fail;
    if ((err = SSLHashSHA1.update(&hashCtx, &signedParams)) != 0)
        goto fail;
        goto fail;
    if ((err = SSLHashSHA1.final(&hashCtx, &hashOut)) != 0)
        goto fail;

    I wish *my* C bugs were so obvious as this one. ;-)

    This means that not only was this missed, but that the function, a vital security function, was never even tested.

    The comments on Bruce's blog are worth reading as well. https://www.schneier.com/blog/archives/2014/02/was_the_ios_ssl.html [schneier.com]

    It is very hard to see how this one could have been accidental.

    • (Score: 5, Interesting) by FatPhil on Sunday March 02 2014, @12:50AM

      by FatPhil (863) <reversethis-{if.fdsa} {ta} {tnelyos-cp}> on Sunday March 02 2014, @12:50AM (#9305) Homepage
      > It is very hard to see how this one could have been accidental.

      Let's pour some kerosene on that...

      http://daringfireball.net/2014/02/apple_prism
      """
      Jeffrey Grossman, on Twitter:
      > I have confirmed that the SSL vulnerability was introduced in iOS 6.0. It is not present in 5.1.1 and is in 6.0.
      iOS 6.0 shipped on 24 September 2012.

      According to slide 6 in the leaked PowerPoint deck on NSA's PRISM program, Apple was "added" in October 2012.

      These three facts prove nothing; it's purely circumstantial. But the shoe fits.
      """
      --
      Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
    • (Score: 1) by forsythe on Sunday March 02 2014, @02:30AM

      by forsythe (831) on Sunday March 02 2014, @02:30AM (#9354)

      was never even tested

      If anything speaks to it being "enemy action", it's this. Anybody could make this error accidentally. Perhaps the chances are astronomically low of somebody making this error and not noticing it, but perhaps the codebase is astronomically large. I can even see something like this being ignored in a diff-only peer review, though that's a bigger stretch.

      Somehow engineering circumstances to have this not tested, however, pretty much has to be intentional.

      • (Score: 2) by frojack on Sunday March 02 2014, @04:22AM

        by frojack (1554) on Sunday March 02 2014, @04:22AM (#9394) Journal

        But, as I understand the bug, that line of code was meant to detect a bad cert, or a man-in-the-middle attack.

        Even in wide scale testing, you are not likely to encounter that in the real world. And in this case, it would just allow the site to load as normal. You'd be owned, but none the wiser. The code would pass the test.

        Adam Langley on his Blog [imperialviolet.org] coded up a cute harmless little demonstrator for this bug.
        This is the direct URL https://www.imperialviolet.org:1266/ [imperialviolet.org]

        Chrome just says No Way.
        Firefox spits confusing jargon that translates to No Way.
        Even crusty old Kong catches this.

        So unless you had a deliberately created bad web site to test with, you would never see this bug. It seems accidental that it was found at all.

        --
        No, you are mistaken. I've always had this sig.
        • (Score: 4, Insightful) by forsythe on Sunday March 02 2014, @05:07AM

          by forsythe (831) on Sunday March 02 2014, @05:07AM (#9406)

          Sure, in the real world this bug would be hard to detect. But I find it hard to believe that anyone at Apple would approve a function for detecting bad certs that didn't even have a test record including data that [should have] failed sslRawVerify (which, as I understand it, is the key step that the goto skips). That's the sort of thing big, professional software companies are supposed to do, isn't it? That leaves a few possibilities: either the test record was doctored, the test cases were carefully constructed not to expose this bug, or there simply weren't any tests intended to cover this case.

          Hanlon's Razor says the third case is most likely, but I'm not so sure I should trust it in this case.

    • (Score: 4, Interesting) by chr1sb on Sunday March 02 2014, @04:36AM

      by chr1sb (2778) on Sunday March 02 2014, @04:36AM (#9401)
      The old maxim "Never attribute to malice what can be attributed to incompetence" applies here too. No human is required to write the code in that specific way for the code to exist in that form. This is the kind of defect that can be introduced by e.g. a merge issue between branches. When code has been modified in both branches, the merge tool will do its best but can easily break the code in this way. This is another reason for good unit tests and code reviews, as pointed out by others. Now, if the unit tests were modified to *no longer* detect that circumstance, then I would be suspicious.

      To reduce the likelihood of these issues arising, the code can be structured in a different way, with no need for gotos, more protection against such merge issues and with structural flaws being more obvious:

      static OSStatus
      SSLVerifySignedServerKeyExchange(SSLContext *ctx, bool isRsa, SSLBuffer signedParams,
                                       uint8_t *signature, UInt16 signatureLen)
      {
          OSStatus err;

          // some code

          if (   (0 != (err = SSLHashSHA1.update(&hashCtx, &serverRandom)))
              || (0 != (err = SSLHashSHA1.update(&hashCtx, &signedParams)))
              || (0 != (err = SSLHashSHA1.final(&hashCtx, &hashOut))))
          {
              SSLFreeBuffer(&signedHashes);
              SSLFreeBuffer(&hashCtx);
              return err;
          }

          // more code
      }

    • (Score: 2) by maxwell demon on Sunday March 02 2014, @01:15PM

      by maxwell demon (1608) on Sunday March 02 2014, @01:15PM (#9579) Journal

      Dijkstra obviously was right.

      --
      The Tao of math: The numbers you can count are not the real numbers.
  • (Score: 4, Insightful) by regift_of_the_gods on Sunday March 02 2014, @12:49AM

    by regift_of_the_gods (138) on Sunday March 02 2014, @12:49AM (#9304)

    That's easier said than done, because lint or maxing out the compiler warnings usually produces a bunch of false positives. However, the bar for a security or crypto library checkin should be higher than for run-of-the-mill code, justifying additional effort to clean up the nuisance warnings.

  • (Score: 0) by kaalon on Sunday March 02 2014, @01:43AM

    by kaalon (499) on Sunday March 02 2014, @01:43AM (#9331)

    So was this a bug that was introduced after testing, or did Apple programmers do no testing on this module? The compiler should be able to flag this as unreachable code.

  • (Score: 1) by lajos on Sunday March 02 2014, @02:19AM

    by lajos (528) on Sunday March 02 2014, @02:19AM (#9349)

    Auditing systems?

    Like as in running git blame?

    And this dude's a freaking "professor of Computer Science at Columbia University and Chief Technologist of the Federal Trade Commission"?

    And bug? What bug? It was a feature added for the government. He should know; if he doesn't, he would just need to call over to some other government branch and ask. Or does he know? And this is just a coverup? Hmmmm....

  • (Score: 0, Offtopic) by MichaelDavidCrawford on Sunday March 02 2014, @02:47AM

    by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Sunday March 02 2014, @02:47AM (#9362) Homepage Journal

    She and I are actually down with Apple not supporting her box anymore.

    However, she takes very good care of it and uses it only for some light email and web browsing. I expect her iMac to last another ten years.

    Ditto for Aunt Peggy's G3 iBook. Aunt Peggy doesn't have the first clue that she's vulnerable; her iBook transmits email to and from my Mom, myself, my sister, and our cousin Glenn just fine.

    For either of them to obtain a patch for this SSL exploit, they would have to spend a minimum of a thousand bucks apiece just to obtain a one-line fix.

    Off-Topic but I will say it anyway:

    As a former Apple System Software Engineer, I am privy to the knowledge that somewhere within the Classic Mac OS system software there was, and quite possibly still is in Mac OS, the following line of code:

    procedure GetDown( AndBoogie: OneMoreTime )

    --
    Yes I Have No Bananas. [gofundme.com]
    • (Score: 3, Interesting) by useless on Sunday March 02 2014, @03:18AM

      by useless (426) on Sunday March 02 2014, @03:18AM (#9372)

      They don't need an update/patch. OSX stopped supporting PPC long before this bug was introduced. There, saved your family over two grand!

    • (Score: 1) by Indigo on Sunday March 02 2014, @10:42PM

      by Indigo (1756) on Sunday March 02 2014, @10:42PM (#9795)

      As a former Macintosh developer (way back, circa 1988-1992), I can say that absolutely makes my day. Would love to hear any details about it you may recall.