
posted by Fnord666 on Tuesday February 25 2020, @09:02AM
from the don't-add-them-to-begin-with dept.

Why fixing security vulnerabilities in medical devices, IoT is so hard:

When your family opened up that brand-new computer back when you were a kid, you didn't think of all the third-party work that made typing in that first BASIC program possible. There once was a time when we didn't have to worry about which companies produced all the bits of licensed software or hardware that underpinned our computing experience. But recent malware attacks and other security events have shown just how much we need to care about the supply chain behind the technology we use every day.

The URGENT/11 vulnerability, the subject of a Cybersecurity and Infrastructure Security Agency advisory issued last July, is one of those events. It forces us to care because it affects multiple medical devices. And it serves as a demonstration of how the software component supply chain and availability of support can affect the ability of organizations to update devices to fix security bugs—especially in the embedded computing space.

URGENT/11 is a set of vulnerabilities in the Interpeak Networks TCP/IP stack (IPNet), which was licensed out to multiple vendors of embedded operating systems. IPNet became the main networking stack in Wind River VxWorks after Wind River acquired Interpeak in 2006, and support for the IPNet licenses held by other vendors was discontinued. (Wind River itself was acquired by Intel in 2009 and spun off in 2018.) But the end of support didn't stop several other manufacturers from continuing to use IPNet. When critical bugs were discovered in IPNet, it set off a scare among the numerous medical device manufacturers that run it as part of their product builds.

The average medical or Internet of Things (IoT) device relies on multiple free software or open source utilities. These pieces of software are maintained by any number of third parties—often by just one or two people. In the case of Network Time Protocol (ntp)—software that runs in billions of devices—the code is maintained by a single person. And when the OpenSSL Heartbleed vulnerability came out in 2014, the OpenSSL project had just two developers working on it. While there are many more developers working on it now, the Heartbleed crisis is emblematic of what happens when we build free software into our devices: the software gets adapted, but it is rarely patched or maintained on the device, and little benefit flows back to the upstream project.

The S in IoT stands for Security


Original Submission

 
  • (Score: 2) by Rich (945) on Tuesday February 25 2020, @01:50PM (#962345) Journal

    Free or non-free does not matter here, and no one cares.

    No one cares: About 15 years ago, I updated an old Microware OS/9 68K system to an embedded PowerPC running Linux. No one in the roughly three echelons above me (I was the contractor) had any detailed knowledge of the licensing issues. So I made sure everything was in order, told them so, packed all upstream sources, all our patches, and everything that was statically linked against LGPL code (a good bit, because dynamically linking C++ was broken at the time) onto a CD, told them to archive it well, and to hand it out to everyone asking for the source. Just before shipping, someone from a remote corner of the big company got involved, noted that all was well, and added a printed copy of the GPL with an offer for the source to the packing list. Ten years later, they must have had an audit, and they called me about how I had dealt with the licence. I re-told them the history, they found the CD, and the auditors seemed to be happy. To my knowledge, no customer ever asked for the sources.

    It does not matter: If anyone had the full sources, they still could not release a fix they made, because it would have to go through the regulatory process. To pass that, they would basically need the whole company fabric to show they adhered to the processes that ensure the software works. There is of course the "COTS" (*) magic spell, which could be creatively applied, but the effort needed to deploy a local fix is impossibly high. It might work with a foundation of device owners pooling together for such things, but I haven't heard of this happening.

    "COTS" means "Commercial Off The Shelf", which allows manufacturers who have gone though painstaking tests of their own software to (more or less) let it run on whatever version of Windows MS feels it might ship on any random day.
