
SoylentNews is people

posted by hubie on Friday March 10, @05:29AM   Printer-friendly

There's never enough time or staff to scan code repositories:

Software dependencies, or a piece of software that an application requires to function, are notoriously difficult to manage and constitute a major software supply chain risk. If you're not aware of what's in your software supply chain, an upstream vulnerability in one of your dependencies can be fatal.

A simple React-based Web application can have upward of 1,700 transitive NodeJS "npm" dependencies, and after a few months "npm audit" will reveal that a relatively large number of those dependencies have security vulnerabilities. The case is similar for Python, Rust, and every other programming language with a package manager.
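To see how a handful of direct dependencies balloons transitively, consider a toy traversal over an invented dependency graph (real tools would read package-lock.json or the package manager's own metadata):

```python
# Toy illustration: counting transitive dependencies by walking a
# dependency graph breadth-first. The graph below is invented for
# the example; real tools parse package-lock.json or similar.
from collections import deque

def transitive_deps(graph, root):
    """Return the set of all packages reachable from `root`, excluding root."""
    seen = set()
    queue = deque(graph.get(root, []))
    while queue:
        pkg = queue.popleft()
        if pkg in seen:
            continue
        seen.add(pkg)
        queue.extend(graph.get(pkg, []))
    return seen

graph = {
    "my-app": ["react", "webpack"],
    "react": ["loose-envify", "object-assign"],
    "loose-envify": ["js-tokens"],
    "webpack": ["acorn", "loose-envify"],
}
print(len(transitive_deps(graph, "my-app")))  # 6 packages from only 2 direct deps
```

Two direct dependencies already pull in six packages here; at real-world fan-out, 1,700 is unsurprising.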

I like to think of dependencies as decaying fruit in the unrefrigerated section of the code grocer, especially npm packages, which are often written by unpaid developers who have little motivation to put in more than the bare minimum of effort. They're often written for personal use and they're open sourced by chance, not by choice. They're not written to last.

[...] Not all hope is lost. For known (reported and accepted) vulnerabilities, tools exist, such as pip-audit, which scans a developer's Python working environment for vulnerabilities. npm audit does the same for Node.js packages. Similar tools exist for every major programming language and, in fact, Google recently released OSV-Scanner, which attempts to be a Swiss Army knife for software dependency vulnerabilities. Whether developers are encouraged (or forced) to run these audits regularly is beyond the scope of this analysis, as is whether they actually take action to remediate these known vulnerabilities.
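As an illustration of what such audit tooling reports, here is a small sketch that summarizes a report shaped like pip-audit's `--format json` output. Treat the exact field names ("dependencies", "vulns") as an assumption and check the tool's documentation:

```python
# Sketch: summarizing a dependency audit report. The JSON shape is
# modeled on pip-audit's `--format json` output (a "dependencies" list,
# each entry carrying a "vulns" list); the exact fields are an
# assumption for illustration.
import json

def summarize_audit(report_json):
    """Return (total_packages, vulnerable_packages, total_vulns)."""
    report = json.loads(report_json)
    deps = report.get("dependencies", [])
    vulnerable = [d for d in deps if d.get("vulns")]
    total_vulns = sum(len(d.get("vulns", [])) for d in deps)
    return len(deps), len(vulnerable), total_vulns

sample = json.dumps({
    "dependencies": [
        {"name": "requests", "version": "2.19.0",
         "vulns": [{"id": "PYSEC-2018-28", "fix_versions": ["2.20.0"]}]},
        {"name": "idna", "version": "3.4", "vulns": []},
    ]
})
print(summarize_audit(sample))  # (2, 1, 1)
```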

However, luckily for all of us, automated CI/CD tools like Dependabot exist to make these fixes as painless as possible. These tools continually scan your code repositories for out-of-date packages and automatically submit a pull request (PR) to fix them. Searching for "dependabot[bot]" or "renovate[bot]" on GitHub and filtering to active PRs yields millions of results! Still, weighing those few million dependency fixes against the hundreds of millions of PRs active at any given time is hard to do meaningfully without an in-depth analysis.

[...] Did you install your packages from the command line? If so, did you type them in properly? Now that you've installed your dependencies "correctly," did you verify that the code for each dependency does exactly what you think it does? Did you verify that each dependency was installed from the expected package repository? Did you ....
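One of those questions, whether each dependency was really installed from the expected source, can at least be spot-checked with digest pinning, which is the idea behind pip's hash-checking mode (`--require-hashes`). A minimal sketch, where the file and digest are throwaway stand-ins rather than a real package:

```python
# Sketch: verifying a downloaded package artifact against a pinned
# SHA-256 digest, the same idea behind pip's `--require-hashes` mode.
# The "artifact" here is a throwaway temp file, not a real wheel.
import hashlib
import tempfile

def verify_artifact(path, expected_sha256):
    """Return True if the file's SHA-256 digest matches the pinned value."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest() == expected_sha256

# Demo with a temp file standing in for a downloaded wheel:
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"not a real wheel")
    artifact = tmp.name
pinned = hashlib.sha256(b"not a real wheel").hexdigest()
print(verify_artifact(artifact, pinned))  # True
```

A digest pinned at review time means a later, silently swapped artifact fails the check instead of installing.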

Probably not, and that's OK! It's inhumane to expect developers to do this for every single dependency. The best bet for software developers, software companies, and even individual tinkerers is to have some form of runtime protection/detection. Luckily for us all, detection and response tools have emerged relatively recently and now form a healthy, competitive ecosystem! Many of them, like Falco, Sysdig Open Source, and Osquery, even have free and open source components. Most even come with a default set of rules/protections.

Original Submission

This discussion was created by hubie (1068) for logged-in users only, but now has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by deimios on Friday March 10, @11:53AM (1 child)

    by deimios (201) Subscriber Badge on Friday March 10, @11:53AM (#1295482) Journal

    "especially npm packages, which are often written by unpaid developers who have little motivation to put in more than the bare minimum of effort" - Didn't I hear this from Microsoft back when they openly hated Open Source?

    Alas, if you want to get somewhere you won't reinvent the wheel every time. You just create a dependency on some high grade perfected wheels.

    I am not talking about meme-grade packages like leftpad, isOdd, isEven, etc., but rolling your own ORM is more trouble than it's worth.

    Unless you're paid by the hour and they explicitly state no external dependencies, then go ahead and rebuild the universe atom by atom.

    • (Score: 3, Insightful) by theluggage on Friday March 10, @04:44PM

      by theluggage (1797) on Friday March 10, @04:44PM (#1295523)

      Alas, if you want to get somewhere you won't reinvent the wheel every time. You just create a dependency on some high grade perfected wheels.

      I think the problem is that developers don't need languages, they need platforms with all the "wheels" you need to develop real-world applications. Modern languages like Node.js*, Python (and possibly Rust, but I can't speak from experience there) have been built on a "rock soup" basis and depend on third-party dependencies to be useful (e.g. Node.js is no use as a website backend without adding things like Express and jsdom...). Even things like 'leftpad' are somewhat forgivable when the core language only has very limited string-formatting functions.

      Cf. something like C which - on non-Unix systems - usually came bundled with a substantial subset of the Unix standard libraries (which, post-ANSI/K&R edition 2, effectively became part of the language), which was pretty much all you needed for writing text-mode software. Pascal by itself was as much use as an ashtray on a motorbike, but the successful implementations like VAX Pascal, Turbo Pascal, Delphi etc. came with a lot more (up to and including GUIs and application frameworks for Delphi). Visual C++ came with MS Foundation Classes and ODBC libraries. Java: a complete cross-platform GUI class library. PHP - fucking awful language - but the standard distribution included a shedload of libraries for HTML/XML parsing, database access: everything you needed for writing web backends.

      I remember wanting to write a program to parse some XML files (I couldn't help that they were in XML) and wondering if this was an opportunity to try Python. Looked for Python XML support and found a couple of half-finished XML libraries and a lot of online beard-pulling about the most "pythonesque" way of supporting XML - PHP came with an official build of Apache's XML libraries in the standard distribution (this was years ago, but not before Python was well established - things may have improved since).

      On the one hand, the "rock soup" method probably helps ensure that we can get full-featured development tools for free, and helps stop Microsoft from pulling an Embrace, Extend and Extinguish on competitors to Visual C++. It's easy to forget that in the "good old days" you had to pay for your developer tools. On the other hand, this reliance on third-party dependencies (which have dependencies on their backs to bite 'em) for much-needed functionality is really getting out of hand (esp. NPM) - and we could do with some well-curated, stable, 'standard' packages to accompany these "new" languages.

      (* OK, that's an ECMAScript implementation rather than a language in itself, but regular ECMA/JavaScript traditionally comes wrapped in a web browser, so the "platform" is usually DOM/HTML.)

      (Kids, get off my lawn and take your modern "fragile development" and "minimum viable product" rubbish with you! :-) )

  • (Score: 2, Funny) by shrewdsheep on Friday March 10, @12:29PM

    by shrewdsheep (5215) on Friday March 10, @12:29PM (#1295487)

    I agree with the analysis that there are risks. I am unsure how much the tools mentioned can mitigate the problem. I believe one strategy for adding security is similar to OS choice. Linux is a less attractive target, so it is more secure. Similarly, use C or C++ for the critical stuff, as it has a much smaller audience than the scripting languages and will be targeted to a lesser degree. Plus, the available libraries often have commercial backing, which might imply code review or even audit.

  • (Score: 3, Insightful) by Snospar on Friday March 10, @12:35PM

    by Snospar (5366) Subscriber Badge on Friday March 10, @12:35PM (#1295488)

    You may have a problem with some of your installed modules? Here, install this other module to check. Safe? Sure, it's on GitHub and in the repositories. What could possibly go wrong?

  • (Score: 4, Interesting) by Rosco P. Coltrane on Friday March 10, @04:02PM

    by Rosco P. Coltrane (4757) on Friday March 10, @04:02PM (#1295512)

    I code in Python for a living. The stuff I code always imports the barest minimum of modules, especially if they're not part of the core Python distribution.

    I only import stuff that really makes life easier. For instance, the ever-useful pySerial. Obviously I'm not gonna recode cross-platform serial routines.

    I tend to import widely used modules, or ultra-specialized modules distributed by the companies themselves. For example, Python bindings for Basler cameras distributed by Basler [], or Python modules to talk to LabJack DAQ devices distributed by LabJack [].

    For obscure modules that have added value but that I don't quite trust, I download them, review them, and then only use the reviewed copy from our intranet. For example, the PyGPD3303S module [] to talk to GPD3303S USB power supplies, or the Prologix GPIB-to-Ethernet Python wrapper [] to talk to Prologix GPIB Ethernet adapters: those two modules are obviously made by two nice dudes in their spare time, but I don't want to find out the hard way one day that their repos have been taken over by hackers.

    Finally, for really simple stuff, I just recode the functionality myself. For example, all of our networked products use a very small subset of IPv4. I'm not going to import the ipaddress module just to do a few simple calculations to figure out a broadcast address or a network address. I just made my own ultra-simple, ultra-limited IPv4 class that suits our needs, doesn't require any external import and loads faster. It's a 20-liner and I know exactly what it does.

    Do that and you won't end up with 1,700 transitive NodeJS "npm" dependencies. I suspect those who depend on other people's work to that extent are code monkeys who are too lazy to implement simple things, or too dumb to do them themselves.
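A minimal sketch of the kind of self-contained IPv4 helper described above (hypothetical code, not the poster's actual class):

```python
# A deliberately tiny IPv4/CIDR helper of the sort described above:
# no imports at all, just enough to compute network and broadcast
# addresses. A hypothetical illustration, not the poster's real code.
class TinyIPv4:
    def __init__(self, cidr):
        addr, prefix = cidr.split("/")
        self.prefix = int(prefix)
        o = [int(x) for x in addr.split(".")]
        self.addr = (o[0] << 24) | (o[1] << 16) | (o[2] << 8) | o[3]
        self.mask = (0xFFFFFFFF << (32 - self.prefix)) & 0xFFFFFFFF

    @staticmethod
    def _dotted(n):
        return ".".join(str((n >> s) & 0xFF) for s in (24, 16, 8, 0))

    def network(self):
        return self._dotted(self.addr & self.mask)

    def broadcast(self):
        return self._dotted((self.addr & self.mask) | (~self.mask & 0xFFFFFFFF))

ip = TinyIPv4("192.168.1.42/24")
print(ip.network(), ip.broadcast())  # 192.168.1.0 192.168.1.255
```

The standard library's ipaddress module does all of this and more; the point of the 20-liner is that it has zero dependencies and its entire behavior is auditable at a glance.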

  • (Score: 4, Insightful) by Mojibake Tengu on Friday March 10, @04:55PM (4 children)

    by Mojibake Tengu (8598) on Friday March 10, @04:55PM (#1295525) Journal

    Well, it's quite simple: Every time you use a foreign library or module for a critical project, you shall fork it and keep your own repository at hand, on your own infrastructure.
    If you are a corporation, assign an employee to maintain it and interact with upstream.

    Doing otherwise is crazy unprofessional, for essentially you implicitly trust people you don't know and, most importantly, do not control.
    Added costs are necessary. Every street bar out there has a brute bouncer, and it's for safety, not for show. Hard experience of many generations.

    Code responsibly. With epic skills comes epic responsibility.

    The edge of 太玄 cannot be defined, for it is beyond every aspect of design
    • (Score: 3, Interesting) by guest reader on Friday March 10, @06:34PM

      by guest reader (26132) Subscriber Badge on Friday March 10, @06:34PM (#1295571)

      Exactly. We copy the source code of each foreign library into a 3rdparty directory of our project. The licenses are compatible. We then either use header-only versions or build these 3rdparty libraries from our build environment. This approach has proved very useful, particularly with the Boost library. The minimum effort is to at least store the last known working version of each foreign library in our 3rdparty directory as a tarball. We have been doing this for more than 10 years.
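The tarball-pinning approach above can be enforced mechanically. A sketch of a CI-style check, where the manifest format and file names are assumptions for illustration:

```python
# Sketch: confirm every vendored tarball in a 3rdparty/ directory still
# matches the SHA-256 recorded when it was imported. The manifest format
# and file names are invented for the example, not a real project layout.
import hashlib
import os
import tempfile

def check_vendored(manifest, root):
    """manifest: {filename: expected_sha256}. Return names whose digest differs."""
    mismatched = []
    for name, expected in manifest.items():
        with open(os.path.join(root, name), "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        if digest != expected:
            mismatched.append(name)
    return mismatched

# Demo with a throwaway directory standing in for 3rdparty/:
root = tempfile.mkdtemp()
with open(os.path.join(root, "boost-1.81.tar.gz"), "wb") as f:
    f.write(b"pretend tarball contents")
manifest = {"boost-1.81.tar.gz": hashlib.sha256(b"pretend tarball contents").hexdigest()}
print(check_vendored(manifest, root))  # [] means everything matches
```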

    • (Score: 2) by turgid on Friday March 10, @06:35PM

      by turgid (4318) Subscriber Badge on Friday March 10, @06:35PM (#1295572) Journal

      Indeed. Take responsibility for the code you are writing and delivering. Own it, requirements, quality, testing, documentation, security and all.

    • (Score: 3, Interesting) by istartedi on Friday March 10, @11:43PM (1 child)

      by istartedi (123) on Friday March 10, @11:43PM (#1295616) Journal

      I'm pretty sure it was like that at every professional project I worked on. The only thing to add is to make sure that you really can restore from backup. I know that was an exercise I saw at least once: pull the backup tape, image a raw box, build the entire product. No connection to an external repository required. "A company you can hold in your hand" - that's how it should be.

      Merging in public changes is a PITA, a busy-work chore, but the people responsible for it didn't seem to mind too much. There was usually enough "dev" in their "ops" to keep it interesting.

  • (Score: 2) by krishnoid on Friday March 10, @07:16PM

    by krishnoid (1156) on Friday March 10, @07:16PM (#1295577)

    Probably not, and that's OK! It's inhumane to expect developers to do this for every single dependency. The best bet for software developers, software companies, and even individual tinkerers is to have some form of runtime protection/detection.

    When you go the Debian route, the packages are all (to my understanding) run through a gauntlet, and anything that doesn't make it through, never shows up in the release. That way only those packages that make it through an inhumane set of interoperability checks are the ones you build your application on top of. But that's ok, because man's inhumanity to software (from a static analysis perspective, anyway) isn't unethical, really.

  • (Score: 2) by PinkyGigglebrain on Friday March 10, @08:33PM

    by PinkyGigglebrain (4458) on Friday March 10, @08:33PM (#1295590)

    Dependencies []

    "Beware those who would deny you Knowledge, For in their hearts they dream themselves your Master."