
posted by martyb on Thursday July 25 2019, @05:52PM   Printer-friendly
from the pics-or-it-didn't-happen dept.

Alleged critical VLC flaw is nothing to worry about -- and is nothing to do with VLC

There has been a degree of confusion over the last few days after news spread of a supposed vulnerability in the media player VLC. Despite being labelled by security experts as "critical", VLC's developers, VideoLAN, denied there was a problem at all.

And they were right. While there is a vulnerability, it was in a third-party library, not VLC itself. On top of this, it is nowhere near as severe as first suggested. Oh -- and it was fixed over a year ago. An older version of Ubuntu Linux was to blame for the confusion.

The problem actually exists in a third-party library called libebml, and the issue was addressed some time ago. The upshot is that if you have updated VLC within the last year, there is no risk whatsoever. VLC's developers are understandably upset at the suggestion that their software was insecure.

Also at Tom's Hardware, Boing Boing, and The Register.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2) by DannyB on Thursday July 25 2019, @06:49PM (25 children)

    by DannyB (5839) Subscriber Badge on Thursday July 25 2019, @06:49PM (#871190) Journal

    It sounds like VLC dynamically linked to the vulnerable library.

    I've noticed that the trend in Linux has been not to statically link libraries into applications, but to depend on things being installed in the base system. (Ideally the application is packaged for the distribution, and declares those dependencies to the packaging system so it all just works like magic.)

    Is dynamic linking still a good idea or not?

    Dynamic Linking Advantages:
    * smaller disk footprint -- but who cares these days
    * if the distribution promptly updates a vulnerable library, then many applications are fixed at once

    Static Linking Advantages: (Dynamic Linking Disadvantages)
    * applications that are not packaged, or not up to date, in the distribution can be much easier to set up. Sometimes it is just one executable. See: AppImage.
    Example: the latest GIMP was not in stretch, and waiting for buster was too long. The AppImage package of GIMP was perfect: just download and execute.

    Static Linking Disadvantages:
    * the developer of the app needs to be on top of security updates to 3rd party libraries
    * applications are larger in size
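
    A quick sketch of how to tell the two apart on a typical glibc-based Linux system:

```shell
# A dynamically linked executable lists the shared objects the loader
# will pull in at run time; a statically linked one prints
# "not a dynamic executable" instead.
ldd /bin/sh
```

    On most systems this prints a few lines naming libraries like libc.so.6 together with the paths they resolve to.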

    Building an application with libraries statically linked feels like packaging it in a container (a la Docker). What's in it just works. You don't need to look inside. All dependencies included. Just plug it in.

    In practice, if you keep a distribution up to date, everything just works. But sometimes it feels like too long a wait before a new distribution with updated apps. Or with Ubuntu it feels like, unless you get on the LTS version, you will have to keep upgrading the distribution. But if on LTS, you might not get newer major app upgrades.

    Just pontificating

    --
    The lower I set my standards the more accomplishments I have.
  • (Score: 0) by Anonymous Coward on Thursday July 25 2019, @06:56PM (7 children)

    by Anonymous Coward on Thursday July 25 2019, @06:56PM (#871193)

    Really, with musl libc, static linking is very attractive nowadays. However, there are quite a few large libraries, required by major pieces of software, that absolutely refuse to play nicely. AppImage/flatpak/containers/etc cgroup shenanigans all have big drawbacks WRT network/filesystem transparency, and make it very easy to introduce privilege escalations unless you're very careful and knowledgeable.

    • (Score: 2) by DannyB on Thursday July 25 2019, @07:17PM (6 children)

      by DannyB (5839) Subscriber Badge on Thursday July 25 2019, @07:17PM (#871208) Journal

      There is an effort to build an OpenJDK (i.e., the Java development kit) against musl libc. Actually I think Azul's Zulu offers one, or at least a build for Alpine.

      Java itself is a gigantic dependency of otherwise small applications that depend on it.

      --
      The lower I set my standards the more accomplishments I have.
      • (Score: 0) by Anonymous Coward on Thursday July 25 2019, @07:27PM (1 child)

        by Anonymous Coward on Thursday July 25 2019, @07:27PM (#871213)

        Java is a work of the devil!

        To be any good, you have to work in straight up binary. No abstractions mean no distractions. You cut to the chase. Do it right, and the kernel will fit on a floppy again, along with VLC, a browser, an email and bittorrent client, and will find cheap tickets to Vegas.

        • (Score: 4, Informative) by DannyB on Thursday July 25 2019, @07:34PM

          by DannyB (5839) Subscriber Badge on Thursday July 25 2019, @07:34PM (#871219) Journal

          That works. As long as development cost is no object.

          However, businesses are optimizing for dollars, not for bytes and CPU cycles. If I need an extra 64 GB of RAM but can beat my competitor to market, my manager won't even blink.

          --
          The lower I set my standards the more accomplishments I have.
      • (Score: 2) by c0lo on Friday July 26 2019, @05:09AM (3 children)

        by c0lo (156) Subscriber Badge on Friday July 26 2019, @05:09AM (#871333) Journal

        Java itself is a gigantic dependency of otherwise small applications that depend on it.

        Now, tell me about Perl, Python, NodeJS and so many others.

        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
        • (Score: 2) by DannyB on Friday July 26 2019, @01:20PM (2 children)

          by DannyB (5839) Subscriber Badge on Friday July 26 2019, @01:20PM (#871440) Journal

          Can Python, NodeJS, and God forbid Perl actually:
          1. be as large of a download
          2. be as large of an install
          3. have the largest number of files on disk
          4. have the biggest on disk pawprint
          and most importantly
          5. use as much memory . . .

          . . . as Java ?

          IMO, Java has them all beat.

          --
          The lower I set my standards the more accomplishments I have.
          • (Score: 2) by c0lo on Friday July 26 2019, @02:03PM (1 child)

            by c0lo (156) Subscriber Badge on Friday July 26 2019, @02:03PM (#871471) Journal

            Really? I just don't feel this is something so totally out of proportion, would you mind checking it again?

            Excluding the docos (docs and man) and legals, here's what my java-11 installation looks like:


            /usr/lib/jvm/java-11-openjdk-amd64$ ls
            bin conf docs include jmods legal lib man release

            /usr/lib/jvm/java-11-openjdk-amd64$ find bin conf include jmods lib -type f -print | wc -l
            158

            /usr/lib/jvm/java-11-openjdk-amd64$ du -h -c bin conf include jmods lib
            484K bin
            16K conf/security/policy/limited
            12K conf/security/policy/unlimited
            32K conf/security/policy
            36K conf/security
            4.0K conf/management
            44K conf
            12K include/linux
            228K include
            186M jmods
            68K lib/jli
            4.0K lib/security
            35M lib/server
            4.0K lib/jfr
            178M lib
            364M total

            364 MB in 158 files.

            ----

            Now, let me try the nodejs installation:

            ~/bin/node$ ls
            bin CHANGELOG.md include lib LICENSE README.md share

            ~/bin/node$ find bin include lib share -type f -print | wc -l
            3620

            ~/bin/node$ du -h -c bin include lib share
            38M bin
            24K include/node/libplatform
            12K include/node/openssl/archs/linux-armv4/no-asm/include/openssl
            32K include/node/openssl/archs/linux-armv4/no-asm/include
            12K include/node/openssl/archs/linux-armv4/no-asm/crypto/include/internal
            16K include/node/openssl/archs/linux-armv4/no-asm/crypto/include
            24K include/node/openssl/archs/linux-armv4/no-asm/crypto
            60K include/node/openssl/archs/linux-armv4/no-asm
            12K include/node/openssl/archs/linux-armv4/asm/include/openssl
            32K include/node/openssl/archs/linux-armv4/asm/include
            12K include/node/openssl/archs/linux-armv4/asm/crypto/include/internal
            16K include/node/openssl/archs/linux-armv4/asm/crypto/include
            24K include/node/openssl/archs/linux-armv4/asm/crypto
            60K include/node/openssl/archs/linux-armv4/asm
            ...
            ...
            27M lib/node_modules/npm
            27M lib/node_modules
            27M lib
            8.0K share/systemtap/tapset
            12K share/systemtap
            16K share/doc/node
            20K share/doc
            20K share/man/man1
            24K share/man
            60K share
            69M total

            69 MB in 3620 files

            --
            https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
            • (Score: 2) by DannyB on Friday July 26 2019, @03:04PM

              by DannyB (5839) Subscriber Badge on Friday July 26 2019, @03:04PM (#871496) Journal

              Thank you! That is very interesting.

              Java runtime: fewer but larger files.

              NodeJS: many smaller files

              I suspect NodeJS cannot compete with Java on performance.

              The Java runtime runs java bytecode. It starts out interpreted. Continuous dynamic profiling identifies functions using disproportionate cpu. Those are immediately compiled to native code by the C1 compiler, and put on a list to soon be recompiled by the C2 compiler. C1 quickly compiles simple code. C2 spends time generating highly optimized code.

              C2 aggressively inlines code. C2 has a 'global view' of the entire running application, unlike an ahead-of-time compiler such as those for C, Go, etc.

              If a class is dynamically reloaded during runtime, causing some aggressively inlined code to become stale, then those functions are de-optimized back to being byte code interpreted again. If they still use disproportionate cpu they will get compiled again by C1 then C2.

              I was just reading about Vectorised Byte Operations in C2 [github.io].

              C2 is also able to use instructions specific to your actual processor model. Unlike an ahead of time compiler.

              --
              The lower I set my standards the more accomplishments I have.
  • (Score: 2) by ikanreed on Thursday July 25 2019, @07:01PM (3 children)

    by ikanreed (3164) Subscriber Badge on Thursday July 25 2019, @07:01PM (#871197) Journal

    You missed a major dynamic linking advantage.

    Inter-oper-a-bil-ity. Application A can use the same damned apache installation and configuration as application B, because A and B dynamically link to the same centrally sourced cgi-bin. Application A spits out config files that Application B can use, because they both dynamically link to the same object serialization library.(Don't @ me that two different applications shouldn't be sharing a serialization format, it happens). Application A has an SSL standard that matches Application B elsewhere on the network, because they're both patched.

    If it were as easy as "Just static link dipshits" people would just static link.

    • (Score: 2) by DannyB on Thursday July 25 2019, @07:19PM (1 child)

      by DannyB (5839) Subscriber Badge on Thursday July 25 2019, @07:19PM (#871210) Journal

      (Don't @ me that two different applications shouldn't be sharing a serialization format, it happens).

      Okay, I won't.

      But isn't using cgi-bin not such a good idea maybe?

      --
      The lower I set my standards the more accomplishments I have.
      • (Score: 2) by ikanreed on Thursday July 25 2019, @07:27PM

        by ikanreed (3164) Subscriber Badge on Thursday July 25 2019, @07:27PM (#871214) Journal

        I dunno, I haven't done a proper apache stack in ages.

    • (Score: 2) by darkfeline on Saturday July 27 2019, @08:20AM

      by darkfeline (1030) on Saturday July 27 2019, @08:20AM (#871814) Homepage

      That sounds like a disadvantage, because application A invariably wants version X of the dependency and application B wants version Y of the dependency and God help you get both versions X and Y installed without breaking application C which wants version Z of the dependency.

      You don't need dynamic linking to share a configuration file.

      --
      Join the SDF Public Access UNIX System today!
  • (Score: 3, Insightful) by Anonymous Coward on Thursday July 25 2019, @07:28PM (2 children)

    by Anonymous Coward on Thursday July 25 2019, @07:28PM (#871215)

    License concerns would be another reason.

    You can dynamically link LGPL code from closed source stuff without issue, but linking it statically violates the license.

    • (Score: 2) by DannyB on Thursday July 25 2019, @07:44PM

      by DannyB (5839) Subscriber Badge on Thursday July 25 2019, @07:44PM (#871225) Journal

      Excellent observation!

      One that I missed. As a Java developer I deal in JAR files (i.e., libraries), which are always dynamically linked. You can include them in a packaged Java application, but even then, the end user is easily able to simply drop in newer updated JAR files in compliance with the LGPL. Even in unsupported abandonware. Thus I didn't think about this implication of static linking.

      --
      The lower I set my standards the more accomplishments I have.
    • (Score: 2) by TheRaven on Friday July 26 2019, @10:02AM

      by TheRaven (270) on Friday July 26 2019, @10:02AM (#871392) Journal
      No it doesn't. You can statically link LGPL'd things as long as you provide (at least) the .o / .a files from your build so that someone else can statically link against a different version.
      --
      sudo mod me up
  • (Score: 4, Insightful) by ledow on Thursday July 25 2019, @07:29PM (5 children)

    by ledow (5567) on Thursday July 25 2019, @07:29PM (#871217) Homepage

    "* the developer of the app needs to be on top of security updates to 3rd party libraries"

    This, as this incident shows, is a HUMONGOUS problem, on a different scale from anything else you mention.

    If VLC had statically linked that library, every single binary would need to be updated, and people told not to use anything but the very latest.

    You static link if you don't have dependency resolution (e.g. Windows) because getting people to install ten vaguely related libraries to view a movie file is stupid. You dynamically link whenever you otherwise can (e.g. Linux).

    The arguments for static linking are really spurious at best, and the cause of a lot of security problems.

    • (Score: 2) by DannyB on Thursday July 25 2019, @07:49PM (1 child)

      by DannyB (5839) Subscriber Badge on Thursday July 25 2019, @07:49PM (#871228) Journal

      I agree and understand that. Still there are motivations sometimes. Waaaaaah! (sniff) Waaaaaaaaaaaaaaah!!! I want the latest GIMP on my pixelbook but the new debian won't be out for another three weeks!

      --
      The lower I set my standards the more accomplishments I have.
      • (Score: 2) by coolgopher on Friday July 26 2019, @05:50AM

        by coolgopher (1157) on Friday July 26 2019, @05:50AM (#871342)

        Use the source, Luke...erm, Danny.

    • (Score: 0) by Anonymous Coward on Friday July 26 2019, @01:05AM

      by Anonymous Coward on Friday July 26 2019, @01:05AM (#871304)

      Well, to allay the fears and screaming from the loony-bin asylum known as UserLand: VLC was in the updates on my distro. The storm has passed; calm down and enjoy the teacup again.

    • (Score: 2) by darkfeline on Saturday July 27 2019, @08:37AM (1 child)

      by darkfeline (1030) on Saturday July 27 2019, @08:37AM (#871815) Homepage

      >every single binary would need to be updated

      You make that sound like it's the end of the world. There was a time when binary compatibility was non-existent and you had to compile everything anyway (that's why C is called portable assembly and why POSIX/SUS became a thing).

      Even now, when a half dozen binaries can run on any computer you could reasonably lay grubby hands on, binaries get built and distributed to users ALL THE TIME, through package managers and auto updates.

      God forbid we have to recompile and redistribute a binary in 2019, whatever shall we do.

      >people told not to use anything else but the very latest

      That's already true of a lot of software, dynamic linking or not.

      >The arguments for static linking are really spurious at best, and the cause of a lot of security problems.

      Let me introduce you to something called dynamic linking privilege escalation. https://www.boiteaklou.fr/Abusing-Shared-Libraries.html [boiteaklou.fr]

      Seriously though, here's a neat security trick you can do with statically linked programs: it's called sandboxing.

      You could of course pack all of the libraries along with your dynamically linked program and call it, uh, AppImage, let's say. Congratulations, you just invented static linking.

      --
      Join the SDF Public Access UNIX System today!
      • (Score: 2) by ledow on Sunday July 28 2019, @07:43AM

        by ledow (5567) on Sunday July 28 2019, @07:43AM (#872228) Homepage

        You expanded the scope without realising.

        If VLC has a flaw, then with dynamic and static linking you must update VLC.
        If - as in this case - some linked library has a flaw, then with dynamic linking you update the library, once. With static linking you not only have to update the library, but anything that ever linked it in, which may well be almost impossible to determine.

        Take, for example, OpenSSL. If there's a flaw, you update the package and go about your life. But have you seen how much stuff, on all kinds of platforms, including platforms that do not have dependency resolution, statically links it, sometimes without saying so? Try just about every program in Program Files on Windows, for example.

        Additionally, on such platforms, dynamic linking can be inherently broken by design (DLL Hell), effectively giving every program that ships with an OpenSSL DLL in its program folder a "static" copy of it that does *not* get updated when other programs update it. In my program files folders there are no less than 31 copies of ssleay.dll - god knows how many of the other 50-odd files containing "ssl" actually are the same library under a different name, and who on Earth knows how many have been statically compiled. Updating any one does not update the others.

        But, hey, at least they're not used for anything critical like, say, OpenVPN for Windows, right?

        Now those are the ones you can *see*. The ones you *can't* see might well be entirely invisible to any search (e.g. in a packed executable like ASPack or UPX for example). There are some 300 files that contain strings suspiciously similar to those only found in the OpenSSL library itself (i.e. not things that programs that only use OpenSSL would naturally contain).

        That's now 300+ separate files that - assuming all their developers are on the ball, still developing, know that the software has a flaw and that their own program uses that code internally, pushing updates religiously, and for which you have some kind of update mechanism (e.g. when was the last time you auto-updated Inkscape, or OpenVPN for Windows?) - will have to be updated for every single flaw in OpenSSL.

        Dynamic linking solves the problem. One update, at an OS level, and you're done.

        Windows is basically using static linking, because the dynamic linking for anything other than MSVCRT DLLs is woefully pitiful and useless (and hence why developers static-link or even dynamic-link and then bundle the MSVCRT installer on first install - which is often just as outdated as anything else!). Developers bundle *everything* with the program because you have no sensible way to determine if the library is already installed, what happens if that library is removed (e.g. installing app A installs library L, installing app B sees library L already so doesn't install it. User removes app A which removes library L. App B now stops working!), or to get the OS/user to make sure that everything is in place.

        On any other OS, there's literally no excuse... nobody bundles OpenSSL with their program. They rely on it being there, and ask for it to be installed if it's not. One copy, in one well-defined location, with one place to update.

        Static linking is the equivalent of updating your heavily-linked spreadsheets using a chalk and a board eraser. When this changes, we need to update this... and this... and this... and this... Oops! We forgot that one! Full system compromise!

        Your example is literally a terrible-security problem - it requires a "setuid" executable, and a piss-poor security setup that lets a user override LD paths and direct them to a user-writable area. It's literally the epitome of "give up your system admin job now". P.S. Windows has a ton more problems, e.g. it will use an existing in-memory copy of a DLL in preference to what a program tries to load, that have nothing to do with "using dynamic libraries" but instead "totally cocking up the implementation by design because we literally couldn't be bothered to fix the DLL Hell of Windows 3.1 in over 20 years".

        Ignoring terribly configured operating system setups... I'd rather be doing "apt-get update" than chasing down every program on even a single laptop (let alone an entire network) that might potentially, silently, invisibly be using a library that we know is totally insecure and compromised. Hell, I'd rather just stop using those programs entirely than the latter, in fact.

        Static linking should never have been allowed on any OS (but, as you point out, and as Windows' use of DLLs shows, there are an almost infinite number of ways to effectively perform static linking without explicit compiler support).

        This is not an "imaginary" scenario. OpenSSL is pretty much dying because of its insecurity and unmaintainability. Even zlib has had major problems over the years. There are no end of common libraries deployed over every major OS, with known and serious flaws, that require updating - and which are still being shipped inside static-linked binaries (as paid products, no less) and with bundled DLLs which are effectively the same thing.

        Updating every single one of those, every time there's a problem, is not far from "the end of the world" that I describe it as.

        Dynamic linking.
        No library bundling.
        Centralised OS update mechanism.

        There's a reason that certain OS were designed that way several decades ago.

        Don't even get me started on things like "SDL2.DLL"... libraries contain versioning information for a reason.

  • (Score: 2) by c0lo on Friday July 26 2019, @05:16AM (3 children)

    by c0lo (156) Subscriber Badge on Friday July 26 2019, @05:16AM (#871337) Journal

    Wait... I know this one!
    Just rewrite VLC in JavaScript and execute it in browser. No static or dynamic linking, the app is always updated if newer than browser cache versions are found in the cloud. See? No problems.

    (large grin)

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford