
posted by takyon on Tuesday December 12 2017, @03:51AM   Printer-friendly
from the fuzzy-illogic dept.

Submitted via IRC for SoyCow8317

Research presented this week at the Black Hat Europe 2017 security conference has revealed that several popular interpreted programming languages are affected by severe vulnerabilities that expose apps built on these languages to attacks.

The author of this research is IOActive Senior Security Consultant Fernando Arnaboldi. The expert says he used an automated software testing technique named fuzzing to identify vulnerabilities in the interpreters of five of today's most popular programming languages: JavaScript, Perl, PHP, Python, and Ruby.

[...] The researcher released XDiFF as an open source project on GitHub. A more detailed presentation of the testing procedure and all the vulnerabilities is available in Arnaboldi's research paper named "Exposing Hidden Exploitable Behaviors in Programming Languages Using Differential Fuzzing."
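For a sense of what differential fuzzing means here, a minimal sketch of the general idea (illustrative only, not XDiFF's actual code; it assumes python3 and ruby binaries on the PATH): generate an input, hand the same input to several interpreters, and flag any divergence in behavior as worth investigating.

    /* Differential fuzzing, minimal sketch (uses POSIX popen). */
    #include <stdio.h>
    #include <string.h>

    /* Run a shell command and capture the first line of its output. */
    static void run(const char *cmd, char *out, size_t n) {
        out[0] = '\0';
        FILE *p = popen(cmd, "r");
        if (p) {
            if (!fgets(out, (int)n, p))
                out[0] = '\0';
            pclose(p);
        }
    }

    int main(void) {
        /* One sample payload; a real fuzzer generates huge numbers of
           these from a dictionary of interesting primitives. */
        const char *payload = "9**19";
        char cmd[256], out_a[256], out_b[256];

        snprintf(cmd, sizeof cmd, "python3 -c 'print(%s)' 2>&1", payload);
        run(cmd, out_a, sizeof out_a);

        snprintf(cmd, sizeof cmd, "ruby -e 'puts %s' 2>&1", payload);
        run(cmd, out_b, sizeof out_b);

        /* A divergence (different result, error message, or crash)
           flags a behavioral difference worth a human look. */
        if (strcmp(out_a, out_b) != 0)
            printf("divergence on %s:\n  python: %s  ruby:   %s",
                   payload, out_a, out_b);
        else
            printf("agreement on %s: %s", payload, out_a);
        return 0;
    }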

Source: https://www.bleepingcomputer.com/news/security/secure-apps-exposed-to-hacking-via-flaws-in-underlying-programming-languages/


Original Submission

 
  • (Score: 2) by DannyB on Tuesday December 12 2017, @05:13PM (1 child)

    by DannyB (5839) Subscriber Badge on Tuesday December 12 2017, @05:13PM (#608796) Journal

    C, and probably other languages, like Java, aren't so much the problem. It is their libraries that are the problem.

    Why did anyone ever think that null terminated strings were a good idea?

    Why did anyone think that library string operations, given only a string pointer, shouldn't also require a maximum buffer length?
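    To make that concrete, a minimal sketch of the classic pitfall (the textbook example, not anything from the article): strcpy() takes only a destination pointer and trusts the caller, while a bounded call makes the destination size explicit.

        #include <stdio.h>
        #include <string.h>

        int main(void) {
            char buf[8];
            const char *input = "a string much longer than eight bytes";

            /* strcpy(buf, input);  -- no size limit: writes past the end
               of buf, undefined behavior, the classic buffer overflow. */

            /* A bounded copy takes the buffer size and always
               null-terminates: */
            snprintf(buf, sizeof buf, "%s", input);
            printf("%s\n", buf);   /* prints the first 7 chars: "a strin" */
            return 0;
        }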

    And then there are format string vulnerabilities. Just as one example, take printf, which accepts a complex format string. Lazy application programmers simply pass the string they want to print as the format string, with no additional arguments.
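    The textbook illustration (a sketch, not from the article):

        #include <stdio.h>

        int main(int argc, char **argv) {
            const char *user_input = (argc > 1) ? argv[1] : "%x %x %x";

            /* Vulnerable: user input becomes the format string, so "%x"
               leaks stack contents and "%n" can write to memory. */
            /* printf(user_input); */

            /* Safe: the format string is a constant; input is only data. */
            printf("%s\n", user_input);
            return 0;
        }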

    Now, I would point out that Java solves these problems, but it does not eliminate other problems. See my PHP with CURL post a few posts above.

    Who ever thought that running an Applet in a Browser that could have all sorts of complex interactions with the browser's JavaScript was a good idea? (And substitute "Applet" with any of: Flash, ActiveX and Silverblight)

    Even ignoring browser Applets, a server-side Java application can have problems with complex library interactions, similar to the PHP / CURL interaction problem. And I suppose C or C++ could have similar complex library interaction problems.

    I think the compiler is the very least of anyone's worries. Compilers tend to be extremely solid. They generate fixed machine code from fixed source code. Even in Java, the bytecode is generated from source and then, at execution time, compiled into the specific machine code for your actual hardware processor. These processes are not generally where the vulnerabilities lie. (Although there is the classic Trusting Trust problem, demonstrated with C: what if your compiler binary were modified so that, when compiling the compiler, it recognized what it was doing and generated a compromised binary of the compiler from clean source code?)

    --
    People today are educated enough to repeat what they are taught but not to question what they are taught.
  • (Score: 2) by Arik on Tuesday December 12 2017, @05:52PM

    by Arik (4543) on Tuesday December 12 2017, @05:52PM (#608818) Journal
    "C, and probably other languages, like Java, aren't so much the problem. It is their libraries that are the problem."

    Oh, libraries are a big part of it. But in practice they might as well be the language: they're how the language is actually used, and in most cases they're why it's used.

    But the languages themselves, in the more idealistic meaning, are still part of this. We didn't make higher- and higher-level languages just for shits and giggles; we did it because moving the level of abstraction higher offloads more of the work onto the compiler/interpreter. That means less work for you to get the computer to do something, which seems like an obvious win in many cases.

    But the trade-off is that the more abstractions lie between you and the actual code, the more difficult things become from the security standpoint. From the standpoint not of 'how do we make it do x?' but instead 'how do we make sure it doesn't do anything besides x?'

    "Why did anyone ever think that null terminated strings were a good idea?"

    Why did anyone ever think it was a good idea to name the special termination character 'null', which means 0, when the digit '0' is another character entirely? That's my question.

    But to answer your question: it seems like a very good idea to store strings in as compact a form as possible when you're working with systems that have ~1 KB of RAM.
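    A sketch of that trade-off (and of the '\0'-versus-'0' distinction above): null termination costs exactly one byte per string, while a Pascal-style length prefix costs a length field that also caps the maximum length.

        #include <stdio.h>
        #include <stdint.h>

        int main(void) {
            /* C style: the characters plus one terminating 0x00 byte. */
            const char c_str[] = "HI";                       /* 3 bytes */

            /* Pascal style: explicit length, no terminator. Same 3 bytes
               here, but a one-byte length caps strings at 255 chars. */
            struct { uint8_t len; char data[2]; } p_str = { 2, {'H', 'I'} };

            printf("sizeof c_str = %zu\n", sizeof c_str);    /* 3 */
            printf("sizeof p_str = %zu\n", sizeof p_str);    /* 3 */

            /* The terminator '\0' (value 0) is not the digit '0'
               (value 48) -- hence the naming gripe. */
            printf("'\\0' = %d, '0' = %d\n", '\0', '0');
            return 0;
        }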

    This is actually an issue that relates back to the level of abstraction, in an interesting way. If you're programming directly in a first- or second-generation language, you have to think about these things directly, and in a form that pretty much mirrors the actual reality of what's happening in the program. So you are able to think about how many bytes you have available to terminate a string, about how many physical read requests you're generating, and so forth. And that means you have to know what machine you're writing for; portability between different sorts of computer systems is just not part of the design.

    When you put more abstractions between yourself and that reality, you lose a lot of your ability to control the hardware, and the idea is explicitly to trust that the compiler or interpreter will do that thinking for you so you can focus on other things. So you get portability designed in: C as 'portable assembly'. When you use it, you should be able to forget all about the physical hardware and just say 'store these strings' and later 'return the strings stored earlier', without worrying about what they're stored on in the meantime, or how they're terminated, or any of that.

    But as your example points out, that's not really what you get. You're still writing to a machine; it's just no longer the physical machine the program will run on but a sort of abstract machine, and you're still worrying about how many bytes are used to terminate your strings. So Java is another incarnation of the same idea, now made explicit: you're programming to a VM. Which at least has better specifications available, now that it's explicit.

    "Who ever thought that running an Applet in a Browser that could have all sorts of complex interactions with the browser's JavaScript was a good idea?"

    People who have money and buy ads, unfortunately.

    --
    If laughter is the best medicine, who are the best doctors?