
posted by janrinok on Monday October 19 2015, @11:05PM   Printer-friendly
from the found-and-fixed dept.

What the Qualys researchers have found is a companion memory leak (CVE-2015-5333) and buffer overflow (CVE-2015-5334) in LibreSSL, the OpenSSL replacement candidate.

The researchers from Qualys (their notice published here) said they were trying to see if a remote code execution attack is feasible against vulnerabilities they've turned up in OpenSMTPD (which earlier this month hit version 5.7.3).

“Because we could not find one in OpenSMTPD itself, we started to review the malloc()s and free()s of its libraries, and eventually found a memory leak in LibreSSL's OBJ_obj2txt() function; we then realized that this function also contains a buffer overflow (an off-by-one, usually stack-based).”

The memory leak provides a path for an attacker to cause a denial-of-service attack, and also permits triggering of the buffer overflow.
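
By way of illustration only (an invented sketch, not the actual LibreSSL code, which the article does not quote), the class of leak described typically comes from an allocation that an early error return never releases:

    #include <stdlib.h>
    #include <string.h>

    /* Illustrative only.  The scratch buffer is allocated up front, but the
     * early error return skips free(), so every malformed input leaks a
     * little memory; enough such inputs can push the process toward
     * exhaustion, hence the denial-of-service angle. */
    int handle_input(const char *input)
    {
        char *scratch = (char *)malloc(128);
        if (scratch == NULL)
            return -1;

        if (input == NULL || strlen(input) > 100)
            return -1;              /* BUG: scratch is never freed on this path */

        strcpy(scratch, input);     /* safe: length checked above */
        /* ... further processing elided ... */

        free(scratch);
        return 0;
    }

The fix is simply to release the buffer (or funnel every exit through a single cleanup path) before any early return.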

The LibreSSL team has released fixes for OpenBSD.


Original Submission

 
  • (Score: 2, Insightful) by engblom on Tuesday October 20 2015, @05:24AM

    by engblom (556) on Tuesday October 20 2015, @05:24AM (#252179)

    I have said it many times: C is too difficult for humans.

    I have very deep respect for the developers of OpenBSD and the quality of their code. Very few projects in any language have as few serious bugs as the OpenBSD project does. Still, every now and then mistakes happen because of manual memory management, even for these very skilled people, and despite their code reviews.

    We are not able to begin replacing C for OS development at this very moment, but I really wish more effort would be put into getting a replacement language ready. Rust looks promising, but it is not fully there yet. This is something that should have been done a long time ago.

    Imagine what these very skilled developers could accomplish if they could spend less time searching for memory management bugs and instead concentrate on the algorithms and on making sure the code they wrote really does what they want. I am sure we would see much faster progress with fewer bugs, as more time would be spent on correctness rather than memory management.

    Then there are other benefits a more modern language could bring beyond just memory management. A newer language could bring in different kinds of atomic operations for threads, making race conditions and mutex locking problems a thing of the past.

  • (Score: 5, Interesting) by NCommander on Tuesday October 20 2015, @05:44AM

    by NCommander (2) Subscriber Badge <michael@casadevall.pro> on Tuesday October 20 2015, @05:44AM (#252184) Homepage Journal

    Unfortunately, the alternatives aren't great, especially for interop with other code. (I'm discounting C++ and ObjC, as they're directly descended from C.) Of the alternatives I've looked at:

    Go was a good attempt, but I honestly think they dumbed the language down far too much. I'm not super experienced with it, but my general feeling is that I'm fighting the language to do anything remotely complicated. Furthermore, the tools are idiotic: the way "go get" works has no bearing on the realities of source control, shared libraries only became a thing with 1.5, and binding to anything but C appears to require a C shim. On the plus side, goroutines and channels are very nice features, but they suffer from the simplicity of everything else.

    D has serious issues with its toolchain that make it a non-starter for me. It also feels too close to C for me to avoid falling into the same pitfalls, but I'm not overly experienced with it beyond a bit of sample code.

    Rust is unproven and doesn't have Google-level backing, and I'm not convinced they won't change the language semantics again.

    Haskell (and other functional programming languages) is a massive paradigm shift. I'm not saying functional programming is bad, but I struggled to get my mind around it, and I'm still not confident in the language. In terms of performance, it has a legacy of being slow, though I believe recent GHC builds are close to C level.

    Ada has a lot of money behind it, but the language is extremely pedantic (not necessarily a bad thing), and being Pascal-based with an odd syntax for classes doesn't help much. The situation is made worse by the fact that setting up GNAT is painful and AdaCore's GNAT is pure-GPL licensed; a quirk with Ada is that, due to the way generics work, your headers have to be under a non-copyleft license to avoid GPL bleedover; that's why the GMGPL exists for Ada code. Furthermore, GNAT sometimes lags behind AdaCore on language features (it was about 1-2 years before Ada 2005 support finally appeared in GCC proper; I'm unsure of the state of Ada 2012). This is further compounded by the fact that Ada doesn't have the greatest binding support, and any complex C/C++ bindings are a project in and of themselves.

    Modula-3 (yes, I looked at this) is actually a surprisingly nice language, and it handles information hiding very well via the REVEAL clause. However, its toolchain, while BSD-licensed (I think), is incredibly quirky. Binding support appears poor.

    On paper, Cython sounds great, but its differences from standard Python cause some libraries to simply not work or behave strangely. I've heard horror stories trying to get Python's high-level math libraries working with it.

    --
    Still always moving
    • (Score: 3, Informative) by TheRaven on Tuesday October 20 2015, @08:47AM

      by TheRaven (270) on Tuesday October 20 2015, @08:47AM (#252214) Journal

      Unfortunately, the alternatives aren't great, especially for interop with other code. (I'm discounting C++ and ObjC, as they're directly descended from C.)

      You really should look at C++ again. With C++14, writing memory-safe code is pretty easy, as long as you're disciplined enough never to use 'new' (always create objects with make_unique() or make_shared()). Importantly, the things that can cause memory-safety problems are now syntactically distinct from safe operations. This is even more true with the GSL's owner<> and similar.
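
      A minimal sketch of that discipline, using only the standard library (the Session type here is invented purely for illustration):

          #include <memory>
          #include <vector>

          struct Session {
              int id;
              explicit Session(int i) : id(i) {}
          };

          int main() {
              // Owning pointers come from the factory functions, never from a
              // bare 'new', so ownership is explicit and a matching delete can
              // never be forgotten or doubled.
              auto one  = std::make_unique<Session>(1);   // sole owner
              auto many = std::make_shared<Session>(2);   // reference-counted

              std::vector<std::shared_ptr<Session>> pool;
              pool.push_back(many);                       // refcount becomes 2

              // 'one' is released at the end of this scope; 'many' once the
              // last shared_ptr (the vector element) goes away.  No explicit
              // delete appears anywhere.
              return 0;
          }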

      Objective-C has better support for public interfaces that maintain ABI stability (and is also easy to bridge from other high-level languages) and with ARC Objective-C++ now works very nicely (for example, you can put Objective-C objects into C++ collections and have the memory management work correctly). Using C++ internally and Objective-C for the coarse-grained interfaces gives quite a nice way of writing libraries.

      --
      sudo mod me up
      • (Score: 3, Insightful) by NCommander on Tuesday October 20 2015, @12:50PM

        by NCommander (2) Subscriber Badge <michael@casadevall.pro> on Tuesday October 20 2015, @12:50PM (#252267) Homepage Journal

        This fairly well summarizes why I don't see C++ as a good solution: The C++ FQA [yosefk.com]. The language is fundamentally broken and does not prevent these problems in the real world. Plus, when using the STL, smart pointers and such, it can be close to impossible to debug when something breaks. I'm not qualified to speak in-depth on ObjC so I won't even try, though my impression is that many of the same points below apply. The new class may solve the object-leak problem, but it wouldn't avoid the off-by-one bug that was found in LibreSSL.

        Furthermore, unless you avoid interfacing with external libraries entirely, it's close to impossible to use smart pointers or smart objects exclusively; too many interfaces still deal in raw pointers. Just off the top of my head: BSD sockets requires raw pointers in and out for send/recv, which is an easy off-by-one bug. Sockets still don't have an abstraction in the STL (and don't tell me Boost is a solution). You can't bolt sane memory protection onto an environment that wasn't designed for it, in a language that actively has to support backwards compatibility. Look at how clumsy generics are in Java; a side effect of the all-powerful backwards-compatibility gods. The STL has had smart pointers for years (auto_ptr was added in C++03 and does essentially the same thing), and they have been ignored for mostly the same reason.
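
        To make the send/recv point concrete (a generic sketch, not code from any particular project): the C API deals only in raw pointers and byte counts, so the caller has to remember to reserve room for a terminator itself.

            #include <sys/types.h>
            #include <sys/socket.h>

            /* Generic sketch of the recv() pattern.  Reading sizeof(buf) bytes
             * and then writing buf[n] = '\0' is the classic stack-based
             * off-by-one when the buffer comes back completely full; reserving
             * the last byte, as below, avoids it.  No smart pointer helps here,
             * because the API only understands raw pointers and lengths. */
            void read_request(int fd)
            {
                char buf[512];
                ssize_t n = recv(fd, buf, sizeof(buf) - 1, 0);
                if (n < 0)
                    return;
                buf[n] = '\0';   /* in bounds thanks to the -1 above */
                /* ... parse buf ... */
            }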

        The problem is made worse by the fact that C++ libraries do not define a standard ABI, so you're tied to the specific version of the compiler that generated them. This is why, if you use Qt or Boost, you have to specify the compiler you're using. So even if you could wrap all your unsafe calls in some sort of abstraction library, you're still stuck in this painful environment where you need to provide versions of that library for every compiler a user may want to use, or have them compile it themselves.

        The language is fundamentally broken, and the C++ technical committee keeps trying to plaster over its flaws, in effect making the language even more impossible to understand. For example, can you name the four types of C++-specific casts and what their differences are? How about multiple inheritance: what happens if I don't declare one of my subclass's methods virtual? And 'static' has several different meanings depending on where and how you use it.

        C at least has the virtue of being somewhat simple, and thus possibly graspable by mere mortals. It's flawed, and badly so, but not to the extent that C++ is.

        --
        Still always moving
        • (Score: 2) by TheRaven on Tuesday October 20 2015, @04:46PM

          by TheRaven (270) on Tuesday October 20 2015, @04:46PM (#252364) Journal
          Every point that you make is either true of all languages that run on a *NIX system, not true of C++11, or not true for the last decade or so.
          --
          sudo mod me up
          • (Score: 2) by NCommander on Wednesday October 21 2015, @07:21AM

            by NCommander (2) Subscriber Badge <michael@casadevall.pro> on Wednesday October 21 2015, @07:21AM (#252625) Homepage Journal

            Can you cite actual examples disproving my claims? I like being proven wrong, but I'm not going to be dissuaded by "it's no longer true".

            I've had to work with a C++14 codebase as of late, and most of the pain is still there, and a lot of other headaches (templates for one) are still a core part of the language, made worse by the fact that all the new stuff is dependent on templates.

            --
            Still always moving
    • (Score: 0) by Anonymous Coward on Tuesday October 20 2015, @01:22PM

      by Anonymous Coward on Tuesday October 20 2015, @01:22PM (#252277)

      Thank you for writing a good review about alternative programming languages without stooping to the childish level of ignorant name calling that many "programmers" display. They're just scared their limited skills and inability to learn new languages will finally be exposed.

  • (Score: 2) by FatPhil on Tuesday October 20 2015, @08:25AM

    by FatPhil (863) <reversethis-{if.fdsa} {ta} {tnelyos-cp}> on Tuesday October 20 2015, @08:25AM (#252209) Homepage
    C coders who insist that all variables need to be declared at the top of the function they're used in are the problem. If they'd gone straight to using a more local scope for that pointer, the memory would not have leaked, and the buffer overflow would have become impossible. Learning to use scope sensibly is not hard. In fact it's trivial. It's just as easy to teach people to use the smallest scope possible as it is to teach them to use the largest-but-one scope possible. C doesn't need fixing; it's just that plenty of C coders need fixing. But plenty of coders in most languages need fixing, and this doesn't make C any worse than any other language.
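
    As a sketch of that point (an invented example, not the LibreSSL code): with the pointer declared in the narrowest block that needs it, the allocation and the free sit next to each other and every exit from the block is easy to audit.

        #include <stdlib.h>
        #include <string.h>

        /* Hypothetical illustration of scope-limited allocation.  Declared at
         * the top of the function, 'copy' would outlive the code that knows
         * how to release it; declared inside the block, it cannot leak or
         * dangle once the block ends. */
        void process(const char *name)
        {
            /* ... unrelated work that never needs the buffer ... */

            if (name != NULL) {
                char *copy = (char *)malloc(strlen(name) + 1);
                if (copy != NULL) {
                    strcpy(copy, name);
                    /* ... use copy ... */
                    free(copy);   /* lifetime ends where the scope ends */
                }
            }

            /* 'copy' does not exist here */
        }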

    (But in other news - Digital Mars D is the better C that all of the other better C's wish they were.)
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
    • (Score: 2) by engblom on Tuesday October 20 2015, @09:33AM

      by engblom (556) on Tuesday October 20 2015, @09:33AM (#252221)

      C coders who insist that all variables need to be declared at the top of the function they're used in are the problem. If they'd gone straight to using a more local scope for that pointer, the memory would not have leaked, and the buffer overflow would have become impossible. Learning to use scope sensibly is not hard. In fact it's trivial. It's just as easy to teach people to use the smallest scope possible as it is to teach them to use the largest-but-one scope possible. C doesn't need fixing; it's just that plenty of C coders need fixing. But plenty of coders in most languages need fixing, and this doesn't make C any worse than any other language.

      (But in other news - Digital Mars D is the better C that all of the other better C's wish they were.)

      First, the OpenBSD coders do not need to be "fixed". They are already very skilled. Still they make mistakes, because it is human to err. C allows a whole class of mistakes that comes purely from manual memory management. Most of the vulnerabilities that reach the news are because of mistakes with C's memory management, and not even the most skilled programmers are immune to failing.

      Second, in many cases it is not about the scope as you often pass around pointers to whatever data structure you are using. The error is because of where the declaration is done, but in checking you stay inside of the memory area you should when you use the data structure the pointer pointed at.

      The mistake in the article was an off-by-one error. Most modern languages make a program crash if you go off by one. With C you are in a more dangerous situation.

      • (Score: 2) by engblom on Tuesday October 20 2015, @09:35AM

        by engblom (556) on Tuesday October 20 2015, @09:35AM (#252222)

        The error is because of where the declaration is done, but in checking you stay inside of the memory area you should when you use the data structure the pointer pointed at.

        I meant:

        The error is not in where the declaration is done, but in checking you stay inside of the memory area you should when you use the data structure the pointer pointed at.

      • (Score: 0) by Anonymous Coward on Tuesday October 20 2015, @10:58AM

        by Anonymous Coward on Tuesday October 20 2015, @10:58AM (#252238)

        >The mistake in the article was an off-by-one error. Most modern languages make a program crash if you go off by one. With C you are in a more dangerous situation.

        Which is readily solved by a custom allocator; a multitude of projects go that route.
        But in this specific case I agree with the sentiment. C is obviously too hard for humans who write "snprintf(buf, buf_len, ".%s", bndec)". :-)
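
        For what it's worth, the usual hazard with that pattern (a generic sketch, not the actual OBJ_obj2txt() code) is snprintf()'s return value: it reports the length the output would have had without truncation, so advancing a write position by it without a check walks off the end of the buffer.

            #include <stdio.h>

            /* Generic sketch, not OBJ_obj2txt().  snprintf() never overflows
             * 'buf' itself, but it returns the untruncated length; skip the
             * truncation check below and 'used' can run past buf_len, so the
             * next call writes out of bounds. */
            int append_labels(char *buf, size_t buf_len, const char **labels, int count)
            {
                size_t used = 0;
                for (int i = 0; i < count; i++) {
                    int n = snprintf(buf + used, buf_len - used, ".%s", labels[i]);
                    if (n < 0)
                        return -1;
                    if ((size_t)n >= buf_len - used)
                        return -1;          /* output truncated: stop here */
                    used += (size_t)n;
                }
                return 0;
            }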

        • (Score: 2) by LoRdTAW on Tuesday October 20 2015, @01:25PM

          by LoRdTAW (3755) on Tuesday October 20 2015, @01:25PM (#252279) Journal

          C is obviously too hard for humans who write "snprintf(buf, buf_len, ".%s", bndec)". :-)

          I think we can blame the C libraries for this.

  • (Score: 0) by Anonymous Coward on Tuesday October 20 2015, @12:39PM

    by Anonymous Coward on Tuesday October 20 2015, @12:39PM (#252265)

    Calculus is also too difficult for humans. Doing all those calculations just to make a rocket go seems wrong. We should all be using paper planes. And live in caves.

    Then there are other benefits a more modern language could bring beyond just memory management

    Until the language no longer works for what you need it to do. So you modify the language until you end up with the same "problems" you intended the "more modern" language to solve.

    • (Score: 0) by Anonymous Coward on Tuesday October 20 2015, @02:08PM

      by Anonymous Coward on Tuesday October 20 2015, @02:08PM (#252298)

      I like how he thinks that 'a modern language' will somehow magically solve all problems. You're not going to get rid of complexity; you can only 'abstract it away', i.e. move it somewhere else [imgur.com]. You're not going to solve anything that way.