
posted by Fnord666 on Friday July 03 2020, @10:41AM   Printer-friendly
from the friend-of-a-friend dept.

More than 75% of all vulnerabilities reside in indirect dependencies:

The vast majority of security vulnerabilities in open-source projects reside in indirect dependencies rather than in components loaded directly and first-hand.

"Aggregating the numbers from all ecosystems, we found more than three times as many vulnerabilities in indirect dependencies than we did direct dependencies," Alyssa Miller, Application Security Advocate at Snyk, told ZDNet in an interview discussing Snyk's State of Open Source Security for 2020 study.

The report looked at how vulnerabilities impacted the JavaScript (npm), Ruby (RubyGems), Java (MavenCentral), PHP (Packagist), and Python (PyPI) ecosystems.

Snyk said that 86% of the JavaScript security bugs, 81% of the Ruby bugs, and 74% of the Java ones impacted libraries that were dependencies of the primary components loaded inside a project.

[...] Snyk argues that companies that scan only their primary dependencies for security issues, without exploring the full dependency tree multiple levels down, would release or end up running products vulnerable to unforeseen bugs.
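The gap Snyk describes between direct-only and full-tree scanning is easy to see with a toy example. The sketch below (Python, with invented package names — none of these are real npm/PyPI packages) walks a small dependency graph and counts how many packages a direct-only scan would never look at:

```python
# Toy dependency graph: each package maps to the packages it requires.
# All names are invented for illustration; they are not real packages.
GRAPH = {
    "myapp": ["web-framework", "json-utils", "logger"],
    "web-framework": ["http-core", "template-engine"],
    "http-core": ["socket-helpers"],
    "template-engine": [],
    "json-utils": ["unicode-shim"],
    "logger": [],
    "socket-helpers": [],
    "unicode-shim": [],
}

def transitive_deps(root, graph):
    """Return every package reachable below root, direct or indirect."""
    seen = set()
    stack = list(graph.get(root, []))
    while stack:
        pkg = stack.pop()
        if pkg not in seen:
            seen.add(pkg)
            stack.extend(graph.get(pkg, []))
    return seen

direct = set(GRAPH["myapp"])
full = transitive_deps("myapp", GRAPH)
indirect = full - direct
print(f"direct: {len(direct)}, indirect: {len(indirect)}")  # direct: 3, indirect: 4
```

A scanner that stops at the direct set never inspects the indirect packages, which is where Snyk says most of the vulnerabilities live.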

So dear Soylentils, how do you track vulnerabilities in libraries that you use in your projects and do you scan beyond direct dependencies?

Previously:
(2020-05-16) Nine in Ten Biz Applications Harbor Out-of-Date, Unsupported, Insecure Open-Source Code, Study Shows


Original Submission

Related Stories

Nine in Ten Biz Applications Harbor Out-of-Date, Unsupported, Insecure Open-Source Code, Study Shows 16 comments

Nine in ten biz applications harbor out-of-date, unsupported, insecure open-source code, study shows:

Ninety-one per cent of commercial applications include outdated or abandoned open source components, underscoring the potential vulnerability of organizations using untended code, according to a software review.

Synopsys, a California-based design automation biz, conducted an audit of 1,253 commercial codebases in 17 industries for its 2020 Open Source Security and Risk Analysis report.

It found that almost all (99 per cent) of the codebases examined have at least one open source component and that 70 per cent of the code overall is open source. That's about twice as much as the company's 2015 report, which found only 36 per cent of audited code was open source.

Good news then, open source code has become more important to organizations, but its risks have followed, exemplified by vulnerabilities like the 2014 Heartbleed memory disclosure bug and Apache Struts flaws identified in 2017 and 2018.

Ninety-one percent of the audited applications had components that are either four years out of date or have exhibited no active development for two years. In 2019 – the time-period covered by the 2020 report – the percentage of codebases containing vulnerable components rose to 75 per cent, up from 60 per cent in 2018.

The percentage of applications afflicted with high-risk flaws reached 49 per cent in 2019, up from 40 per cent in 2018.

[Ed Note - The company that produced this report, Synopsis, is a vendor in this space and is not a disinterested party.]


Original Submission

Backdoor in Public Repository Used New Form of Attack to Target Big Firms 11 comments

Backdoor in public repository used new form of attack to target big firms:

A backdoor that researchers found hiding inside open source code targeting four German companies was the work of a professional penetration tester. The tester was checking clients' resilience against a new class of attacks that exploit public repositories used by millions of software projects worldwide. But it could have been bad. Very bad.

[...] A few weeks later, a different researcher uncovered evidence that showed that Amazon, Slack, Lyft, Zillow, and other companies had been targeted in attacks that used the same technique. The release of more than 200 malicious packages into the wild indicated the attack that researcher Alex Birsan devised appealed to real-world threat actors.

Dependency confusion exploits companies' reliance on open source code available from repositories such as NPM, PyPI, or RubyGems. In some cases, the company software will automatically connect to these sources to retrieve the code libraries required for the application to function. Other times, developers store these so-called dependencies internally. As the name suggests, dependency confusion works by tricking a target into downloading the library from the wrong place—a public source rather than an internal one.

To pull this off, hackers scour JavaScript code, accidentally published internal packages, and other sources to discover the names of code dependencies stored internally by the targeted organization. The hackers then create a malicious dependency and host it on one of the public repositories. By giving the malicious package the same name as the internal one and using a higher version number, some targets will automatically download it and update the software. With that, the hackers have succeeded in infecting the software supply chain the targets rely on and getting the target or its users to run malicious code.
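The version trick described above can be sketched in a few lines of Python. The registry contents and package name here are hypothetical; the point is that a naive resolver that merges candidates from every registry and takes the highest version number fetches the attacker's package:

```python
# Hypothetical registries for one internal package name. The attacker has
# published a same-named package with an absurdly high version publicly.
INTERNAL_REGISTRY = {"acme-billing-utils": ["1.4.0", "1.4.2"]}
PUBLIC_REGISTRY = {"acme-billing-utils": ["99.0.0"]}  # attacker-uploaded

def parse(version):
    return tuple(int(part) for part in version.split("."))

def naive_resolve(name):
    """The vulnerable behavior: highest version seen in ANY registry wins."""
    candidates = INTERNAL_REGISTRY.get(name, []) + PUBLIC_REGISTRY.get(name, [])
    return max(candidates, key=parse)

def safe_resolve(name):
    """Prefer the internal registry; consult the public one only as a fallback."""
    if name in INTERNAL_REGISTRY:
        return max(INTERNAL_REGISTRY[name], key=parse)
    return max(PUBLIC_REGISTRY.get(name, []), key=parse)

print(naive_resolve("acme-billing-utils"))  # 99.0.0 -- the malicious package
print(safe_resolve("acme-billing-utils"))   # 1.4.2  -- the real internal one
```

The mitigation is the same in real package managers: scope each dependency to an explicit registry rather than letting the client pick the "best" version across all of them.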

Previously:
Open-Source Security: It's Too Easy to Upload 'Devastating' Malicious Packages, Warns Google
Dependency Yanked Over Licensing Mishap Breaks Rails Worldwide
More Than 75% of All Vulnerabilities Reside in Indirect Dependencies


Original Submission

  • (Score: 2) by MostCynical on Friday July 03 2020, @10:54AM (19 children)

    by MostCynical (2589) on Friday July 03 2020, @10:54AM (#1015709) Journal

    the only thing that has to be tracked is ensuring the contract has a clause covering third-party inclusion liability exemption.

    For 99% of commercial projects, you are now done.

     

    --
    "I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
    • (Score: 5, Insightful) by bradley13 on Friday July 03 2020, @11:28AM (18 children)

      by bradley13 (3053) on Friday July 03 2020, @11:28AM (#1015711) Homepage Journal

      A liability exclusion is great and all, but it doesn't help your hacked customers, nor does it help your reputation.

      Better to just avoid the problem by not using external libraries, if you can possibly avoid them.

      --
      Everyone is somebody else's weirdo.
      • (Score: 2) by RS3 on Friday July 03 2020, @02:59PM (14 children)

        by RS3 (6367) on Friday July 03 2020, @02:59PM (#1015762)

        Absolutely agree. This is a difficult problem. Library trees are great for fast top-level code development and lots of great functionality, but my fear: since they're mostly used as run-time scripts, even if they're fully debugged and clean today, tomorrow someone might make a change, inadvertently (but negligently) adding bugs back in. And FTFA, often it's malicious evildoers.

        Somewhere I read (thought it was in the linked article but now I can't find it) that it's better to write your own code. I was also thinking, if licensing allows, you could just download the libraries, clip out the functions you need, and host them yourself, rather than rely on 3rd-party hosted code. That'll reduce your attackable surface.

        I'm so relieved that perl isn't on the list. :)
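For Python projects, one concrete middle ground between blind third-party trust and the full copy-it-in approach suggested above is pip's hash-checking mode, where every pinned dependency must match a recorded digest. A hypothetical fragment (the sha256 value is a placeholder, not a real digest):

```
# requirements.txt -- installed with: pip install --require-hashes -r requirements.txt
# Placeholder digest for illustration only; generate real ones with
# `pip hash somelib-1.4.2.tar.gz` or pip-compile --generate-hashes.
somelib==1.4.2 \
    --hash=sha256:0000000000000000000000000000000000000000000000000000000000000000
```

With hash-checking on, pip refuses to install anything whose bytes differ from what was recorded, so a silently republished upstream artifact fails loudly instead of slipping in.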

        • (Score: 2) by JoeMerchant on Friday July 03 2020, @04:05PM (13 children)

          by JoeMerchant (3937) on Friday July 03 2020, @04:05PM (#1015778)

          We develop with "latest" then lock down our system image before validation. It's a bummer when our locked down image ends up needing updating, and total revalidation, but... we don't have a lot of network exposed code - yet, that's coming in the near future and I wonder how our validation model will fare.

          --
          🌻🌻 [google.com]
          • (Score: 2) by RS3 on Friday July 03 2020, @05:09PM (12 children)

            by RS3 (6367) on Friday July 03 2020, @05:09PM (#1015788)

            By "network exposed code", do you mean pulling in 3rd-party libraries? If so, show mgt. TFA.

            And again, my advice would be to use libraries that you can copy in.

            Or code your own, but that gets into the big controversy over "are APIs patentable / copyrightable"??

            • (Score: 2) by JoeMerchant on Friday July 03 2020, @05:40PM (11 children)

              by JoeMerchant (3937) on Friday July 03 2020, @05:40PM (#1015801)

              Management is pretty dialed in, they typically know about vulnerabilities that show up here a few weeks or more before the story drops on Soylent.

              The real battle is: why do we need it? I saw a "meme" the other day that really fits well:

              10 My dishwasher failed to update.

              20 Why did your dishwasher fail to update?

              30 Because it couldn't connect over the internet.

              40 Why does your dishwasher need to update?

              50 To download security patches.

              60 Why does your dishwasher need security patches?

              70 Because it's on the internet.

              80 GOTO 20

              --
              🌻🌻 [google.com]
              • (Score: 2) by RS3 on Friday July 03 2020, @06:26PM (2 children)

                by RS3 (6367) on Friday July 03 2020, @06:26PM (#1015816)

And to remotely brick the dishwasher if A) "unauthorized repair" or B) it's now deprecated / "unsupported".

                • (Score: 2) by fyngyrz on Saturday July 04 2020, @12:35PM (1 child)

                  by fyngyrz (6567) on Saturday July 04 2020, @12:35PM (#1016079) Journal

                  ...after selling your personal information to advertisers, and exposing it to black hats.

                  In the meantime, every time the Internet connection is lost, it refuses to wash your dishes.

                  --
                  Every glass of beer is a tragic story of grains
                  that could have become pizza crust, but didn't.

                  • (Score: 2) by RS3 on Saturday July 04 2020, @02:16PM

                    by RS3 (6367) on Saturday July 04 2020, @02:16PM (#1016107)

                    Actually they already had all of that and knew you were going to buy that dishwasher because it's all part of a mind-control system that has encompassed everything and you wouldn't have been able to not buy that dishwasher.

              • (Score: 0) by Anonymous Coward on Friday July 03 2020, @07:26PM (7 children)

                by Anonymous Coward on Friday July 03 2020, @07:26PM (#1015827)

                Where do they get their info on vulns?

                • (Score: 2) by JoeMerchant on Friday July 03 2020, @08:40PM (5 children)

                  by JoeMerchant (3937) on Friday July 03 2020, @08:40PM (#1015843)

                  I believe it's the committee on vulnerability awareness... a loose collection of people who make it their business to know these things and disseminate the information throughout the organization. For committee members to willingly reveal their sources would require a massive shift in their job security posturing.

                  --
                  🌻🌻 [google.com]
                  • (Score: 0) by Anonymous Coward on Friday July 03 2020, @09:41PM

                    by Anonymous Coward on Friday July 03 2020, @09:41PM (#1015864)

                    Fascinating, and too bad. Thanks for the reply; honestly appreciate it.

                  • (Score: 0) by Anonymous Coward on Saturday July 04 2020, @03:31AM (3 children)

                    by Anonymous Coward on Saturday July 04 2020, @03:31AM (#1015991)

                    There are networks of people who are privy to all sorts of things before the general public. For example, an acquaintance of mine is on a couple of the major invitation-only Linux and distro security lists and he has said that vulnerabilities will show up on that list months and sometimes years before they are patched or publicly acknowledged. I have heard similar stories from our IT department about the major vendors we use as well.

                    • (Score: 2) by RS3 on Saturday July 04 2020, @04:26AM (2 children)

                      by RS3 (6367) on Saturday July 04 2020, @04:26AM (#1016005)

                      Yeah, have read that fairly often here, green site, threatpost, etc. Why don't things get patched immediately? I was going to say "especially if it's open-source" but open or closed- patching should be really fast.

                      • (Score: 0) by Anonymous Coward on Saturday July 04 2020, @07:08AM (1 child)

                        by Anonymous Coward on Saturday July 04 2020, @07:08AM (#1016036)

Patching and testing can be hard. You have to fix the bug behavior while simultaneously preserving the expected behavior as much as possible. In addition, the bug can be caused by the way functions interact, or round trips, or library interaction, or how the entire environment interacts, making it hard to track down where it actually is in the code. And, on more than a few occasions, it turns out that someone was relying on the bug behavior the whole time, non-maliciously, and didn't even realize it. That makes your test and integration suites break, which you then have to fix on top of everything else. And then, they don't want to patch it in one area and leave a bunch of other users of the software unable to update, so you have to wait for groups to be ready, despite their various release processes, to update all at the same time. Then there is also the normal human behavior of people not seeing the problem, having other priorities, dragging of feet in general, and internal politics.

                        • (Score: 2) by RS3 on Saturday July 04 2020, @02:41PM

                          by RS3 (6367) on Saturday July 04 2020, @02:41PM (#1016119)

                          In other words, so much is in place now, and it's so integrated into society, that it's too big to allow it to fail.

                          You'd make a great defense lawyer for the MBAs.

There will always be a human error factor, but this problem is all driven by greed and cost-cutting. I have no problem dealing with a little pain for a much better long-term gain. I've seen far, far more bugs and patches and updates and breaches in commercial software than I have in open-source, including of course Linux, GNU apps and other projects like LibreOffice, KiCAD, etc. They're written by people who are driven by the goal of something that works well and they can take some personal pride and sense of accomplishment, besides filling a personal need.

                          That's also true in commercial software development, but it's all driven by an overall profit motive. Ship it now, we'll fix it later... And that's not philosophical- it's been the driving force in most of my career, including pure hardware stuff. Only very recently I'm doing some part-time work in a field where quality is more important than costs or deadlines, and I hate to admit but sometimes I struggle to adjust. I find I still try to find the cheapest way to do something, or the cheapest parts available, and then the customer's extreme QC rejects it.

                          I feel like I've just written what's been obvious for at least 30 years. I say it's been allowed to go too far. Some of the recent stories like the critical bugs in IoT TCP/IP stack libraries that are almost ubiquitous (in IoT) should wake someone up. I'm not sure how to fix the underlying problem. It'd be great to be part of a mod-system-less discussion group / thinktank.

                • (Score: 2) by RS3 on Saturday July 04 2020, @04:30AM

                  by RS3 (6367) on Saturday July 04 2020, @04:30AM (#1016006)
      • (Score: 2) by legont on Friday July 03 2020, @06:04PM (2 children)

        by legont (4179) on Friday July 03 2020, @06:04PM (#1015810)

        Because of this all open source is being weeded out from my very big financial employer.

        However, it is not all. There are security regulations and my boss has a monthly chat with Federal Reserve about them. Whatever they say we do. The rest we can't care - it's too much already - and we don't.

        --
        "Wealth is the relentless enemy of understanding" - John Kenneth Galbraith.
        • (Score: 2) by canopic jug on Friday July 03 2020, @06:45PM (1 child)

          by canopic jug (3949) Subscriber Badge on Friday July 03 2020, @06:45PM (#1015819) Journal

          Because of this all open source is being weeded out from my very big financial employer.

          Their main goal is fighting Copyleft and that's because Snyk and Blackduck [techrights.org] have close ties to M$ and exist primarily for the purpose of sowing fear, uncertainty, and doubt against free and open source software. The managers there were probably already fans of Bill and were just looking for an excuse to downgrade.

          --
          Money is not free speech. Elections should not be auctions.
          • (Score: 0, Troll) by Anonymous Coward on Friday July 03 2020, @06:59PM

            by Anonymous Coward on Friday July 03 2020, @06:59PM (#1015822)

            Their main goal is fighting Copyleft and that's because Snyk and Blackduck [techrights.org] have close ties to M$

            1990s called, they want their reality back.

It's 2020. Microsoft is one of the largest OSS developers in the world. Their entire profit driver is Azure, which relies on OSS.

            https://azure.microsoft.com/en-gb/blog/expanding-linux-and-oss-support-on-azure/ [microsoft.com]

  • (Score: 3, Insightful) by bradley13 on Friday July 03 2020, @11:27AM (26 children)

    by bradley13 (3053) on Friday July 03 2020, @11:27AM (#1015710) Homepage Journal

    Don't use external libraries. Seriously, avoid them whenever possible.

For one application, I used a little, stand-alone library published by Google. Then came an update, and the little library pulled in a big library, which pulled in 5 or 6 more dependencies. Who knows what all that code did, who knows what security holes it had, and anyway: bloat. There was only one possible answer to this: throw out the library and write my own code for the needed functionality.

    Unless a library or framework is doing something you cannot reasonably replicate, avoid it. Write your own solution. It will be smaller, faster, and you will know that it does exactly and only what you need it to do.

    Examples of libraries you should use: anything built into the language, encryption libraries, front-end frameworks (JavaFX, React, etc.), database connectors.

    Examples of libraries you should avoid: Pretty much everything else

    --
    Everyone is somebody else's weirdo.
    • (Score: 2, Insightful) by Anonymous Coward on Friday July 03 2020, @11:45AM (17 children)

      by Anonymous Coward on Friday July 03 2020, @11:45AM (#1015713)

      "Write your own solution."

      That is a two edged sword.
          The good news is that yours is smaller so less chance for bugs.
          The bad news is only one set of eyes looking for the bugs.

      Maybe a better solution is to encourage small, common, well tested libraries.
          That would provide your answer's benefit (low attack surface) but mine also (many eyes looking for bugs).

      So how would one encourage such a thing?
          Perhaps make it easy to rate how much cruft a web page takes when it loads?

      • (Score: 1, Insightful) by Anonymous Coward on Friday July 03 2020, @11:54AM

        by Anonymous Coward on Friday July 03 2020, @11:54AM (#1015715)

As always, a mixture of approaches is ideal. Everyone writing their own means slower development, fewer eyes, less maintainability. Everyone collaborating on one project brings a monoculture and a single point of failure.

      • (Score: 3, Insightful) by HiThere on Friday July 03 2020, @01:50PM (14 children)

        by HiThere (866) Subscriber Badge on Friday July 03 2020, @01:50PM (#1015748) Journal

        The real problem is "dynamic linking" where the particular library that you are using can be updated after you write the code. No language that does that can safely use external libraries. This is one of my problems with go and rust...those languages are designed to be dependent on the net. If you're going to use an external library it should at least match a checksum.

        --
        Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
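The checksum idea in the parent comment is cheap to implement for vendored files. A minimal Python sketch (the expected digest is assumed to come from your own records, captured when you first vendored the library):

```python
import hashlib

def sha256_of(path):
    """Hash a file in chunks so large libraries need not fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_vendored(path, expected_hex):
    """Refuse to proceed if a vendored library's bytes have changed."""
    actual = sha256_of(path)
    if actual != expected_hex:
        raise RuntimeError(f"{path}: checksum mismatch (got {actual})")
```

Run at build or startup time, this turns a silently modified dependency into a hard, visible failure.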
        • (Score: 2) by RS3 on Friday July 03 2020, @03:02PM

          by RS3 (6367) on Friday July 03 2020, @03:02PM (#1015763)

          Yes, exactly, and I commented on this above.

          My thought was to download the needed library code and host it yourself, if licensing allows.

        • (Score: 3, Informative) by The Mighty Buzzard on Friday July 03 2020, @03:14PM (9 children)

          Rust does not update library versions for a project unless you foolishly use wildcards in your library version numbers. And even then, once you've built it once on that box you have to explicitly tell it to update the libraries every time you want to. This still doesn't protect you from any system-wide shared libraries being updated by the operating system though.

          --
          My rights don't end where your fear begins.
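For reference, the wildcard-vs-pinned distinction described above lives in Cargo.toml version requirements (crate names here are just examples); whatever gets resolved is then frozen in Cargo.lock until you explicitly run `cargo update`:

```toml
# Cargo.toml -- example version requirements (crate names illustrative)
[dependencies]
serde = "1.0.130"   # caret requirement: any compatible 1.x >= 1.0.130
rand  = "=0.8.4"    # exact pin: only 0.8.4 is ever acceptable
regex = "*"         # wildcard: any version at all -- the risky case
```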
          • (Score: 2) by HiThere on Friday July 03 2020, @10:48PM (8 children)

            by HiThere (866) Subscriber Badge on Friday July 03 2020, @10:48PM (#1015888) Journal

OK, I'm no Rust expert, and that's not the way I understood cargo to work. The objection still stands in a weaker form, but in that form it applies to all system libraries. If your system needs to be secure you should include all the libraries you need statically linked. This can get bulky.

            --
            Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
            • (Score: 2) by The Mighty Buzzard on Saturday July 04 2020, @04:13AM

              Their library handling is actually pretty sweet even if the libraries themselves are extremely unpolished and constantly breaking compatibility in the quest to use the shiny new coding thing. Have a look into it if you're actually interested.

              --
              My rights don't end where your fear begins.
            • (Score: 2) by fyngyrz on Saturday July 04 2020, @01:36PM (6 children)

              by fyngyrz (6567) on Saturday July 04 2020, @01:36PM (#1016091) Journal

              If your system needs to be secure you should include all the libraries you need statically linked. This can get bulky.

              Static linking, absolutely.

              Bulk: RAM is cheap. Mass storage is cheap. CPUs can address very large amounts of RAM and easily access significant amounts of storage across multiple, very large storage devices. Even very large software can be loaded from mass storage very fast now.

              The time to worry about executable size for desktop machines (and coming soonish for palmtops) is past, IMO. The remaining important goals to reach for these days are:

              • Capability — as always, the user's needs must be addressed
              • Consistency — thing A should work like thing B if at all possible
              • Documentation — For anything non-trivial, this can be critical
              • Efficient algorithm(s) — because CPUs will never be fast enough
              • Maintainability — If you can't maintain it, you probably shouldn't have written it
              • Network independence — if the network breaks, your code should not
              • Limit resistance — don't assume the user's tasks will never exceed your vision
              • Orthogonality — settings should avoid changing other settings
              • OS independence — Every unique OS feature you use is an opportunity for the OS vendor to limit the accessibility of your software
              • Ownership — "licensing" is a plague. If you don't own it, it owns you.
              • Predictability — If you can't explain why or how it works, it's likely dangerous
              • Portability — We don't live in a world with only one software platform
              • Reliability — because unreliable software ranges from annoying to dangerous
              • Security — because execution environments are typically polluted with black hats
              • Stability over releases — don't break things users have already been provided with
              • Stability over host environment "updates" — OS mutation often imposes bitrot from without
              • Usability — the less attention paid to this, the less use it will get

              OPC, with the single exception of well-written and carefully validated public domain code, tends to impede those goals one way or another. And lawyers. Lawyers pollute everything they touch. Shakespeare's Dick the Butcher [wikipedia.org] had it exactly right, if for perhaps the wrong reasons, depending on how you read the intent.

              --
                Government: Designed to provide you with "service" and...
              ...the Media: Designed to provide you with Vaseline.

              • (Score: 2) by The Mighty Buzzard on Saturday July 04 2020, @02:59PM (4 children)

                Okay, roll yourself a statically linked Linux distro. I'm curious to see how enormous it comes in at and if it'll even boot on a machine with less than 32GB of ram.

                --
                My rights don't end where your fear begins.
                • (Score: 0) by Anonymous Coward on Sunday July 05 2020, @01:46AM (3 children)

                  by Anonymous Coward on Sunday July 05 2020, @01:46AM (#1016345)

This made me do the math on one of our test machines. Based on the output of lsof, grepping for "\.so", stat'ing the results and multiplying by the allocation units, the resulting disk space used by shared libraries is 256,081,920 bytes allocated on disk and 31,531,820 bytes of memory. And this is for a machine that is basically running one service and 25 processes total including the counting process, ttys, and ssh or system ones. It'd probably be interesting to see how much bigger that would be on a machine running X/Wayland or general purpose. However, the various elimination optimizations would cut down on some of the needed space.

                  • (Score: 2) by The Mighty Buzzard on Sunday July 05 2020, @10:46AM (2 children)

                    du -hs reports 698MB for /lib on this box. How much each binary would have added without shared libraries is another matter entirely though.

                    --
                    My rights don't end where your fear begins.
                    • (Score: 0) by Anonymous Coward on Monday July 06 2020, @05:12AM (1 child)

                      by Anonymous Coward on Monday July 06 2020, @05:12AM (#1016852)

To be clear, the count I provided would be the total, counting all repeated libraries multiple times, to somewhat simulate them being included in each static executable based on lsof output. I also understand that the optimizers would remove unused parts of the code for each executable. Your comment just made me curious as to what some sort of ballpark might be on even a stripped down system. Perhaps if I am bored one day, I'll put the static USE flag and the static* flags in the build chain on one of the systems to see what comes out. I'll then stand there staring at the screen like Oppenheimer wondering what sort of monster I have unleashed.

I do find it somewhat interesting that the du -hs on the box is 83MB, just under 12% of your size. And most of that is kernel modules. I'm assuming your box has X11, generic kernels, and other incidental software on it as well, which means your /usr/lib and /usr/*/lib probably add up to a lot more than the 166 MB on the test machine here.

              • (Score: 2) by Pino P on Sunday July 05 2020, @02:24AM

                by Pino P (4721) on Sunday July 05 2020, @02:24AM (#1016349) Journal

                RAM is cheap. Mass storage is cheap.

                Cellular data transfer quota to redownload the package after a security update is not cheap. Nor is mass storage cheap if it is already full and soldered to the mainboard of the paid-for appliance or phone. People are still using 8 GB Android phones where more than half the space is taken by the operating system and two copies of each application: the version that came with the operating system (for use after a factory reset) and the updated version downloaded from the device's app store. And the operating system tends to be incapable of moving a lot of applications to removable storage.

        • (Score: 2) by darkfeline on Friday July 03 2020, @11:01PM (2 children)

          by darkfeline (1030) on Friday July 03 2020, @11:01PM (#1015896) Homepage

          I find the first half of your post reasonable and the second half absurd or ignorant. Go is not only statically linked, but all dependencies are pinned via checksum and minimum specified version (rather than maximum specified version like many language dependency managers) and Go provides both global and private checksum servers and package proxies. It is the most reproducible and privately controllable language dependency manager that I know of, other than vendoring all code directly.

          --
          Join the SDF Public Access UNIX System today!
          • (Score: 2) by HiThere on Saturday July 04 2020, @02:05PM (1 child)

            by HiThere (866) Subscriber Badge on Saturday July 04 2020, @02:05PM (#1016101) Journal

            The checksum is good. Minimum specified version is not. It requires that you trust later versions without knowing anything about them. Which is it?

            Yes, I'm relatively ignorant of go. There are lots of languages out there, and I'm not expert in most of them, perhaps any of them depending on what you mean by expert. I read the go documentation and decided that code that was written was too subject to change by people supplying the libraries. Perhaps I was wrong, but a minimum specified version is *not* reassuring. Later versions being malicious is one of the things that have gotten Javascript programs in trouble. Also, I believe, Ruby. IIRC, it once even bit Debian, though they vet the changes pretty strictly. You need to be able to specify the exact version of the library AND the checksum to be secure. Any changes need the same testing as the original version. Well, that's a bit extreme, but it's close to correct. You need a good suite of tests, and changed libraries need to pass those tests without change (including without fixing any mistakes that your test answers depend on).

            There are decent arguments that all code that uses non-system libraries should be run inside of a jail, but usually need for security isn't strong enough to justify that.

            All that said, this is not an ideal world, and one must make do. But there's no reason to make things worse by increasing remote dependencies. Sometimes you must, but for that to be reasonable you need to ensure that they won't change in the future. There are clear advantages to dynamic linking, but they come at the cost of lessened security. When it's links to system libraries one can say "well, if the system is 0wn3d, then there's nothing one can do", but the same doesn't apply to remote code.

            --
            Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
            • (Score: 2) by darkfeline on Saturday July 04 2020, @11:18PM

              by darkfeline (1030) on Saturday July 04 2020, @11:18PM (#1016299) Homepage

              >Minimum specified version is not. It requires that you trust later versions without knowing anything about them.

Minimum version selection means Go uses the oldest version that satisfies the requirements. It does the exact opposite of what you seem to be describing. Most dependency managers, like pip, will use the latest version available if you specify >=2.1. If you specify "require 2.1", Go will use the minimum specified version, i.e., 2.1, unless either your package or another package explicitly requires a later version, in which case Go will use the minimum version that satisfies that requirement.
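The difference is easy to see in a toy resolver. This is a sketch, not real pip or Go internals; the version lists and constraints are made up for illustration:

```python
# Toy illustration: given the versions a registry offers and a set of
# minimum-version constraints, pip-style resolution picks the *newest*
# match, while Go's minimal version selection (MVS) picks the *oldest*
# version that satisfies every stated minimum.

AVAILABLE = [(2, 1), (2, 2), (2, 3), (3, 0)]  # versions published upstream

def pip_style(constraints):
    """Pick the latest available version satisfying all '>=' constraints."""
    return max(v for v in AVAILABLE if all(v >= c for c in constraints))

def go_mvs(constraints):
    """Pick the smallest available version satisfying all '>=' constraints."""
    return min(v for v in AVAILABLE if all(v >= c for c in constraints))

# Your module requires >= 2.1; nothing else raises the floor.
print(pip_style([(2, 1)]))  # (3, 0): a brand-new release gets pulled in
print(go_mvs([(2, 1)]))     # (2, 1): exactly the version you tested against

# Another dependency in the build explicitly requires >= 2.2, so the
# selected minimum rises only as far as the stated requirements force it.
print(go_mvs([(2, 1), (2, 2)]))  # (2, 2)
```

Under MVS, nothing newer than what some module explicitly asked for is ever chosen, so a malicious or regressed new release doesn't enter the build until someone raises a requirement.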

              --
              Join the SDF Public Access UNIX System today!
      • (Score: 2, Insightful) by Anonymous Coward on Friday July 03 2020, @05:39PM

        by Anonymous Coward on Friday July 03 2020, @05:39PM (#1015800)

        The bad news is only one set of eyes looking for the bugs.

But then again, the good news is probably no set of eyes looking for the exploits. When yours is the only project on the whole wide Internet that uses a specific bit of code, it needs to be a pretty juicy target to recoup the time spent hacking it. With the plethora of low-hanging fruit everywhere, yours being singled out is quite unlikely without a VERY good reason.

    • (Score: 3, Informative) by driverless on Friday July 03 2020, @01:12PM

      by driverless (4770) on Friday July 03 2020, @01:12PM (#1015736)

      That's why djb writes all of his own library routines, he replaces whatever the system provides with djb-created minimalist secure ones. Downside is that you then get a slightly different djbmemcpy() and djbprintf() in every bit of code of his that you use.

    • (Score: 3, Insightful) by The Mighty Buzzard on Friday July 03 2020, @03:36PM (5 children)

      All of the above assumes you're A) Capable of writing every library you need without adding your own bugs or huge performance degradations and B) Have the time to do so. I dunno about you but I'm just not interested in reading up enough to competently write my own Vulkan, H.265, NVIDIA driver, or PCRE implementation. I don't even really care to write my own database or stdio libs. I'd much rather be writing the bits that are the actual purpose of the project in the first place.

      Which is to say, this kind of thing is fine for projects with trivial functionality, enormous manpower, or extremely constrained requirements but ludicrous for most anything else.

      And this is coming from a guy who's been writing zero-library code for industrial equipment as his latest "How little work can I do and still afford fishing gear?" project.

      --
      My rights don't end where your fear begins.
      • (Score: 2) by RS3 on Friday July 03 2020, @06:40PM (2 children)

        by RS3 (6367) on Friday July 03 2020, @06:40PM (#1015817)

        All good points. I broke the rules and read TFA, and my comments in this discussion, and maybe bradley13's too, are in context of the problem areas:

        ftfa:

        security bugs were prevalent in JavaScript, Ruby, and Java

        Java and Node.js projects, in particular, seem to leverage dependencies a lot heavier than other...

I don't think TFA refers to drivers, or really anything else written in C or other compiled languages, stdio, etc. They mentioned PHP and Python as having some of the problem, but at a much lower percentage so far.

        Oddly, they made no mention of perl... :)

        • (Score: 3, Insightful) by The Mighty Buzzard on Saturday July 04 2020, @04:20AM (1 child)

          That's because Perl guys are all old and wise enough to spot someone spouting ideology instead of good sense. There are plenty of libraries that I use on a regular basis simply because I'm not willing to spend twice as long coding a single library as I do coding the core application. There's just no way I'm writing an entire IRC library from scratch every time I decide to rewrite my bot MrPlow in a new language to (re)familiarize myself with it. It's not conscientious, it's bloody stupid.

          --
          My rights don't end where your fear begins.
          • (Score: 0) by Anonymous Coward on Saturday July 04 2020, @07:23AM

            by Anonymous Coward on Saturday July 04 2020, @07:23AM (#1016039)

            One thing Perl has is thousands of insecure libraries that no one really looks at. But they also have thousands of secure packages that are well-vetted and well-used. Instead of a fractured NIH-RTW ecology, there are a good number of modules on CPAN and elsewhere that are a de facto standard library. Everyone uses them for a particular problem, they are simultaneously stable and maintained, and professionals of various beard length watch over them rather than rolling their own.

      • (Score: 2) by Common Joe on Saturday July 04 2020, @10:34AM (1 child)

        by Common Joe (33) <common.joe.0101NO@SPAMgmail.com> on Saturday July 04 2020, @10:34AM (#1016061) Journal

I had to drop an "Insightful" point on you because you're right on the money.

        I would add that frameworks -- a key cornerstone to a lot of applications -- depend on many libraries too.

        • (Score: 2) by The Mighty Buzzard on Sunday July 05 2020, @10:58AM

          It honestly never even occurred to me not to just write the code for most things frameworks bring to the table. I mean, it's not like keystrokes are what takes up most of a coder's time and most of the shit frameworks do is stuff I could code with a wicked hangover and a bunch of noisy fuckers in the room.

          --
          My rights don't end where your fear begins.
    • (Score: 0) by Anonymous Coward on Friday July 03 2020, @06:17PM

      by Anonymous Coward on Friday July 03 2020, @06:17PM (#1015812)

      Yeah, this. It's been a while since I've looked into these things, but it seems like everybody's C library has its own way to handle strings to make up for C's issues. I bet there are hundreds of string libraries running on my machine, all accomplishing the same thing in subtly different ways.

      People blame C for that, but I think it's human nature.

      So you're going to solve this problem by rolling your own, eh?

      You know what? The developers of the library you pulled in had the same problem. They solved it by rolling their own. Now their solution is your problem.

      Aside from some dictator forcing us to use the One True Library for any given functionality, this doesn't seem like a problem that can be fixed, and I wouldn't want it to be fixed that way. The shark infested waters are where innovation happens. Some of it is just redundant; but some of it is progress. We're still in the early days of computing.

  • (Score: 0) by Anonymous Coward on Friday July 03 2020, @11:33AM (3 children)

    by Anonymous Coward on Friday July 03 2020, @11:33AM (#1015712)

    The advocate is stating something that should be trivially and painfully obvious to anybody in the field. And you can bet that it's not the low-level engineers who are the roadblock to testing the sub-sub-dependencies (well, not usually).

But it sure is a lot of work to do it, because there are so many of them, and that sure costs a lot of money; and since they're just sub-sub-dependencies, they can't be important, so we can skip the unimportant stuff (from the result follows the reason). Now where have I heard that line of reasoning before? Hint: a profession where you mostly deal with straight-line diagrams and trivial maths.

    What will come of this article, if anything?

    An order from up high stating: "You must not use any library that has dependencies!!"

    • (Score: 2) by bradley13 on Friday July 03 2020, @12:53PM (1 child)

      by bradley13 (3053) on Friday July 03 2020, @12:53PM (#1015731) Homepage Journal

      Testing things like that is a literally endless task, because the libraries will be updated, at a schedule that you don't control.

If you don't take the updates, you risk unpatched vulnerabilities. If you do take the updates, you will be constantly testing and re-testing. As new features are added to the libraries, you might need to review and update your test cases as well - so even automated testing won't entirely save you.

      Much as I love writing software, I am continually reminded of the quote: "If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization."

      --
      Everyone is somebody else's weirdo.
      • (Score: 0) by Anonymous Coward on Saturday July 04 2020, @01:55AM

        by Anonymous Coward on Saturday July 04 2020, @01:55AM (#1015956)

        "If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization."

        Or the wrong kind of virus.

    • (Score: 2) by TheRaven on Friday July 03 2020, @09:15PM

      by TheRaven (270) on Friday July 03 2020, @09:15PM (#1015857) Journal

It's obvious because the analysis is missing basic graph theory. The cause of 75% (or some similarly large number) of anything in software is going to be in indirect dependencies. If there's a bug in my program, it affects my program. If there's a bug in a library that programs use directly, it affects everything that uses that library. If there's a bug in a library that is used indirectly through other libraries, it will affect everything that uses those libraries. A bug in a library that is used by two independently used libraries will show up in more things than a bug in either of the libraries that use it.

      Equally, fixing a bug in a library that is used indirectly by thousands of programs is significantly cheaper than fixing thousands of instances of different bugs in different reimplementations of the same ideas.
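The fan-out can be sketched with a tiny reachability count. The package names here are invented; the point is only that a deeply shared library reaches more programs than anything above it:

```python
# Toy dependency graph: a bug in the shared bottom library ("libCore")
# affects every package that reaches it, directly or transitively, so
# it shows up in more places than a bug higher in the tree.

DEPS = {                       # package -> its direct dependencies
    "app1": ["libA"],
    "app2": ["libB"],
    "app3": ["libA", "libB"],
    "libA": ["libCore"],
    "libB": ["libCore"],
    "libCore": [],
}

def affected_by(buggy):
    """Everything that depends on `buggy`, directly or transitively."""
    hit = set()
    changed = True
    while changed:             # fixed-point: propagate up the graph
        changed = False
        for pkg, deps in DEPS.items():
            if pkg not in hit and (buggy in deps or hit & set(deps)):
                hit.add(pkg)
                changed = True
    return hit

print(sorted(affected_by("libA")))     # ['app1', 'app3']
print(sorted(affected_by("libCore")))  # ['app1', 'app2', 'app3', 'libA', 'libB']
```

A bug in the direct dependency libA hits two packages; the same bug in the indirect dependency libCore hits five, which is the whole "most vulnerabilities are indirect" effect in miniature.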

      --
      sudo mod me up
  • (Score: 2) by zoward on Friday July 03 2020, @11:59AM (9 children)

    by zoward (4734) on Friday July 03 2020, @11:59AM (#1015718)

    I can't help feeling like this will get worse with snaps and flatpaks. I guess the $64 question will be: will the maintainers of all those snaps and flatpaks be better about keeping dependencies up to date than the distro itself? In some cases, yes. In many other cases .... well, I think I'll stick to system-wide libraries.

    • (Score: 2) by JoeMerchant on Friday July 03 2020, @12:03PM (5 children)

      by JoeMerchant (3937) on Friday July 03 2020, @12:03PM (#1015720)

At least snaps and flatpaks won't bring in bugs with automatic updates of their dependencies.

      We validate that our products meet our requirements. This necessarily has the giant hole of "can't prove a negative," but when we do security / pen testing if we find a vulnerability it tends to stay in the test protocols, so at least we won't get bitten by a future regression.

      --
      🌻🌻 [google.com]
      • (Score: 2) by mmcmonster on Friday July 03 2020, @02:04PM (3 children)

        by mmcmonster (401) on Friday July 03 2020, @02:04PM (#1015752)

        I'm not into coding anymore, so please correct me if I'm incorrect:

        Can't programs require a certain version of a library?

i.e.: "will work with LibME >= 5.4.*, < 5.5"?

Wouldn't that be better than having it installed in a Flatpak as 5.4.3? (So that if 5.4.4 comes out, the system can replace 5.4.3 and the application still runs; if the application breaks with 5.4.4, it can change its dependency accordingly.)

        • (Score: 3, Informative) by JoeMerchant on Friday July 03 2020, @02:59PM (2 children)

          by JoeMerchant (3937) on Friday July 03 2020, @02:59PM (#1015761)

What generally happens is that the dependencies are stated like you say: 5.4 or greater. So 5.6 comes out with a regression in it; the dependency tree picks up 5.6 and accepts it because it's greater than 5.4, and boom: your app now has the regression.

          It's all wonderful theory that newer software is better, but that's far from guaranteed.

I'm not deep into the intricacies of apt / dpkg / etc., but I believe what the article is getting at is that, even if your app pins a dependency on libX 5.4 exactly, libX 5.4 may well have dependencies in its own tree spec'ed like "libY 2.2 or greater", meaning your libX 5.4 can still inherit a regression from libY 2.3 even though you've pinned libX at 5.4 exactly.
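A minimal sketch of that failure mode, with invented package names and a deliberately naive "latest wins" resolver rather than any real apt or pip logic:

```python
# Pinning your direct dependency at an exact version does not pin what
# *it* declares with an open-ended ">=", so a transitive regression can
# still slip into an otherwise fully pinned build.

REGISTRY = {
    "libX": {"5.4": {"libY": ">=2.2"}},  # libX 5.4 leaves libY floating
    "libY": {"2.2": {}, "2.3": {}},      # pretend 2.3 ships a regression
}

def resolve(pkg, version, picked=None):
    """Resolve `pkg==version`, taking the newest match for every '>='."""
    picked = picked if picked is not None else {}
    picked[pkg] = version
    for dep, spec in REGISTRY[pkg][version].items():
        floor = spec[2:]  # strip the ">=" prefix of the toy spec
        best = max(v for v in REGISTRY[dep] if v >= floor)  # latest wins
        resolve(dep, best, picked)
    return picked

print(resolve("libX", "5.4"))  # {'libX': '5.4', 'libY': '2.3'}
```

Even with libX pinned at exactly 5.4, the resolved set quietly includes libY 2.3, the regressed release, because only libX's own ">=2.2" constraint governs it.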

          It's "the price" that comes along with not reinventing the wheel. Thank dog that we've gotten away from rotating hard drives - there were layers of code in their firmware that nobody on the planet remembered how they work, yet virtually every rotating hard drive on the planet used those same layers of mystery code. Now we've got "wear leveling" code in the flash drives that is 10x more complex, but at least it was developed with slightly more modern concepts of version control.

          --
          🌻🌻 [google.com]
          • (Score: 2) by zoward on Friday July 03 2020, @03:56PM (1 child)

            by zoward (4734) on Friday July 03 2020, @03:56PM (#1015776)

            These are all good points. I wonder how many actual infections are due to out of date libraries vs. up-to-date with regression bugs?

            • (Score: 3, Informative) by TheRaven on Friday July 03 2020, @09:18PM

by TheRaven (270) on Friday July 03 2020, @09:18PM (#1015859) Journal

Vulnerabilities are typically due to out-of-date libraries. Other kinds of bugs are more common as a result of updating but not testing the library. There was a paper at EuroS&P this year that did an analysis of upgrading the libraries bundled with a load of popular Android apps to the latest version that claimed to be backwards compatible with the version that they shipped. Something ludicrous like 30% of them actually worked (i.e. didn't crash in a tiny bit of automated UI testing) after the upgrade.

--
              sudo mod me up
      • (Score: 0) by Anonymous Coward on Friday July 03 2020, @09:18PM

        by Anonymous Coward on Friday July 03 2020, @09:18PM (#1015858)

Bad news: snaps aggressively auto-update. It is causing some ruckus in Ubuntu, because they are pushing more and more software via snaps instead of .debs. https://news.ycombinator.com/item?id=23052108 [ycombinator.com]

    • (Score: 1) by petecox on Friday July 03 2020, @12:39PM (1 child)

      by petecox (3228) on Friday July 03 2020, @12:39PM (#1015724)

      I wonder the same about desktop webapps that bundle their own web runtime.

      But with Android, Chrome OS and Edge OS (*Windows 10) now shipping Chromium with regular security updates handled by the Chrome team, perhaps frameworks such as Electron will evolve to become lighter-weight by calling a FFI to the system runtime instead of bundling their own.

      • (Score: 3, Informative) by driverless on Friday July 03 2020, @01:20PM

        by driverless (4770) on Friday July 03 2020, @01:20PM (#1015741)

        perhaps frameworks such as Electron will evolve to become lighter-weight

        Perhaps pigs will fly.
        Perhaps hell will freeze over.
        Perhaps the Cronulla Sharks will win the NRL premiership.
        Perhaps horses will grow horns.
        Frameworks always acquire more bloat. It's a natural process, like Greek/Italian women, they're created to get bigger as time goes by.

    • (Score: 2) by HiThere on Friday July 03 2020, @01:58PM

      by HiThere (866) Subscriber Badge on Friday July 03 2020, @01:58PM (#1015750) Journal

You are assuming that the updates are improvements. Sometimes, though, they can be malicious. And malicious or not, they can introduce *new* bugs that weren't in the prior version (as well as bloat). Every update needs to go through the same analysis as the original library to ensure that it doesn't break things. If it's not externally facing, there is rarely a case where a working program is improved by a library change. And if the library isn't distributed with the program, the library can *become* an attack surface.

      --
      Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
  • (Score: 0) by Anonymous Coward on Friday July 03 2020, @01:06PM

    by Anonymous Coward on Friday July 03 2020, @01:06PM (#1015734)

    Some of it is definitely self inflicted.

    It's certainly possible to have too many dependencies, but most projects have mostly dependencies they need. If you didn't need the dependency, there's probably no code path that invokes it, so its bugs likely don't matter. It's bloat, but not dangerous.

Doing everything yourself is a practical impossibility, especially in the Java and Node ecosystems. You wouldn't accomplish anything anyway; you would just write your own bugs.

Some companies have cultures or policies that make it extremely difficult to update things that need to be updated. "If you update a dependency," they say, "we have to test everything from scratch! We can't afford to do that." This is an example of management making things difficult wherever they can. Some companies will let you update your application dependencies, but not the operating system. That is no better.

    The solution is simply for management to realize that a bug that happens because you didn't update things is not better than a bug that happens because you did update things. Not only are bugs more numerous in outdated code, they're also more dangerous.

    The policy needs to be "every release you update all the dependencies unless something breaks," not "you never update anything until you absolutely have to." Sometimes you will have bugs as a result, but you were going to have bugs anyway.

  • (Score: 1, Insightful) by Anonymous Coward on Friday July 03 2020, @02:12PM (4 children)

    by Anonymous Coward on Friday July 03 2020, @02:12PM (#1015756)

    Of the projects they examined, what percent of the entire code base was made up of code from indirect dependencies?

    Means something very different if 10% of the codebase contained 75% of the bugs, as opposed to 80% of the codebase contained 75% of the bugs.

    • (Score: 3, Funny) by RS3 on Friday July 03 2020, @03:05PM (3 children)

      by RS3 (6367) on Friday July 03 2020, @03:05PM (#1015764)

      Amazingly, TFA answers your questions- with pretty pictures too! :)

      • (Score: 0) by Anonymous Coward on Friday July 03 2020, @03:33PM (1 child)

        by Anonymous Coward on Friday July 03 2020, @03:33PM (#1015767)

        Cool... so, what was the answer?

        • (Score: 2) by RS3 on Friday July 03 2020, @03:52PM

          by RS3 (6367) on Friday July 03 2020, @03:52PM (#1015773)

          IDK, we'll have to ask ikanreed.

      • (Score: 0) by Anonymous Coward on Saturday July 04 2020, @04:49PM

        by Anonymous Coward on Saturday July 04 2020, @04:49PM (#1016170)

        No, it doesn't. The metric they gave is irrelevant. It's bugs from indirect dependencies vs bugs from direct dependencies - without clarifying anything about coverage. The one example they give is in the article itself which states:

        "Ask any Node developer, and they probably have a story of waiting for long periods to open a project while npm is trying to pull all the necessary dependencies," Miller added. "One of our favorite examples is an 80 line Java application that specifies 7 dependencies. When you walk the entire dependency tree, however, you find 59 sub-dependencies, and suddenly, the 80 lines of code turns into 740,000 lines.

        In which case about 99% of your bugs being in dependencies would not be anything particularly meaningful.

  • (Score: 2) by acid andy on Friday July 03 2020, @03:14PM

    by acid andy (1683) on Friday July 03 2020, @03:14PM (#1015765) Homepage Journal

    I imagine every indirect dependency of one project is going to be a direct dependency of an intermediate project, so I find their statement a bit strange.

    TFA is clearly focused on web technologies so I suppose the projects they're interested in are mainly websites and web applications. It's directed at web businesses. For API developers, I think the vulnerabilities will be in their direct dependencies, so I guess it becomes a question of who is responsible for identifying, reporting, and patching these vulnerabilities.

    --
    If a cat has kittens, does a rat have rittens, a bat bittens and a mat mittens?
  • (Score: 0) by Anonymous Coward on Friday July 03 2020, @03:53PM

    by Anonymous Coward on Friday July 03 2020, @03:53PM (#1015774)

    Windows

  • (Score: 3, Insightful) by Subsentient on Friday July 03 2020, @05:19PM (1 child)

    by Subsentient (1111) on Friday July 03 2020, @05:19PM (#1015794) Homepage Journal

    This is why I'm very wary of package managers for languages, especially in a professional setting. The only one that seems to do a decent job is Rust's cargo, because it builds from source, but even that's not foolproof or secure.
    That said, Python's pip is definitely easy, and sometimes that's all you need. However, it's definitely not something you should rely upon.

    As for the horrors of npm and JavaScript, that entire pile of trash is so noxious that I won't touch it at all. May it fry in hell for all eternity.

    --
    "It is no measure of health to be well adjusted to a profoundly sick society." -Jiddu Krishnamurti
    • (Score: 0) by Anonymous Coward on Friday July 03 2020, @07:22PM

      by Anonymous Coward on Friday July 03 2020, @07:22PM (#1015825)

      This is why I'm very wary of package managers for languages, especially in a professional setting.

Agreed. They're usable for rapid development, but the dependency chain becomes too large too quickly. This is also a problem for Linux distros: I have GTK libs on a headless server as dependencies of graphviz, which was required to build the documentation of something I was asked to build from source. The clusterfuck is real!

      As for the horrors of npm and JavaScript, that entire pile of trash is so noxious that I won't touch it at all. May it fry in hell for all eternity.

      They're tools. I've been having fun with a small nim project by compiling to js and running in quickjs. [bellard.org] Am I going to use it... well I could compile quickjs to wasm using emscripten and potentially run my project in a browser. Am I going to do that - no! Am I amused it's possible - yes!

  • (Score: 1, Disagree) by hopdevil on Friday July 03 2020, @05:57PM

    by hopdevil (3356) on Friday July 03 2020, @05:57PM (#1015809)

    A vulnerability in a 3rd tier dependency does not immediately equate to a vulnerability in the application (or 2nd tier dependency for that matter). If you go into the analysis with that mindset, naturally you will find that a single vuln in a core library *may* affect many projects, but this is a meaningless metric. They basically measured how often core libraries are used and have an identified vulnerability.

    As a security researcher I'd rather look at code that is commonly used by everyone rather than a project that only has a dozen downloads.

Don't version-lock your code; fix how your code uses APIs that change over time, and rebuild often.

  • (Score: 0) by Anonymous Coward on Friday July 03 2020, @07:23PM (2 children)

    by Anonymous Coward on Friday July 03 2020, @07:23PM (#1015826)

    Oh bullshit. When counting mobile devices, their market share is down around 30% and that's not enough to command monopoly rents. They've lost the server market, like they lost the phone market, if Netcraft is any indicator. It shows around 4.5% and declining. They're betting what's left of their farm on Azure, but Azure continues to lose money [medium.com] and pretty soon the FTC will have to step in and address the shell game going on.

    Azure has to market Free and Open Source software, but it has only become part of their marketing because it is the only way they can even attempt to bring Azure into relevancy. While that is going on, they've increased their attacks against Copyleft via Snyk, Blackduck, and other proxies. Their use of proxies for software patent attacks has only increased. The whole indemnification scam they have for Azure is about baiting patent trolls (NPEs) to buy patents from them in exchange for a contract prohibiting going after the one or two Azure customers out there. Don't underestimate the harm and cost caused by the use of software patents.

    • (Score: 0) by Anonymous Coward on Friday July 03 2020, @11:11PM

      by Anonymous Coward on Friday July 03 2020, @11:11PM (#1015904)

      if Netcraft is any indicator. It shows around 4.5% and declining

      Netcraft confirms. Azure is dying.

    • (Score: 0, Disagree) by Anonymous Coward on Saturday July 04 2020, @02:05AM

      by Anonymous Coward on Saturday July 04 2020, @02:05AM (#1015961)

      The link claims MS is losing money on the cloud because they bet on Moore's Law continuing. But if it slows, it also slows for the competition.
