SoylentNews is people

posted by hubie on Sunday March 31, @02:24AM   Printer-friendly
from the year-of-linux-in-the-pacemaker dept.

Arthur T Knackerbracket has processed the following story:

AIxCC is the two-year competition that DARPA announced last summer at Black Hat, which challenges teams to build AI-based tools that automatically secure code used in critical infrastructure.

The new government agency partner is the Advanced Research Projects Agency for Health (ARPA-H), an independent research entity within the US National Institutes of Health.

By joining forces with the Pentagon's research arm, ARPA-H aims to promote the development of AI-based tech that can find and fix critical vulnerabilities in medical devices, biotech, and hospital IT systems, thus preventing destructive cyberattacks against life-saving equipment and facilities.

"Healthcare is both acutely being targeted, and it's been more and more targeted over the last few years," ARPA-H program manager Andrew Carney told The Register. "It's also uniquely sensitive to disruptions compared to many other critical infrastructure sectors."

[...] Most of America witnessed this first hand over the past month as a ransomware infection shuttered Change Healthcare's IT systems in February, knocking many pharmacies offline and preventing patients from receiving medication and other care.

"While the repercussions of this incident have been primarily – though not wholly – financial, what keeps me up at night is the possibility of a similar widespread attack directly affecting patient care and safety," US Senator Mark Warner (D-VA) said earlier this month. 

[...] This is where DARPA, partnering with ARPA-H, comes into play to boost AI-enabled technology to secure healthcare systems — and sweeten the monetary rewards.

Competing teams receive challenges based on real-world software used in critical infrastructure systems. Bringing on ARPA-H as a partner will help ensure the competition addresses critical flaws in healthcare. Plus, the research agency has committed an additional $20 million in rewards for the contest.

[...] While Carney can't give away too much about what the contests will involve, one that's already been announced is the Linux kernel challenge project [PDF]. "We know that the Linux operating system powers a lot of the devices and systems in many – if not all – of our critical infrastructure sectors," he said. 

This example challenge reintroduces a real-life vulnerability, CVE-2021-43267, in the Linux kernel's Transparent Inter-Process Communication (TIPC) subsystem, which allows communication across clusters on a network. The challenge vulnerability is a heap-based buffer overflow flaw.

"And successes that we have against that challenge are implicitly very representative of the software that we would need to secure in these sectors at large," Carney said.

"And then specific to healthcare, if we start looking at medical devices, 60 percent of all medical devices run some flavor of Linux operating system," he added. "So once again, as competitors find and fix vulnerabilities in that example challenge, that translates into real-world safety, and better defended, safer systems."

Original Submission

This discussion was created by hubie (1068) for logged-in users only, but now has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Interesting) by canopic jug on Sunday March 31, @04:27AM (1 child)

    by canopic jug (3949) Subscriber Badge on Sunday March 31, @04:27AM (#1351076) Journal

    The fine article points out, yet again, that the main problem is ransomware []. In other words, the problem is M$ products in the production environment, ones which are, even worse, likely connected to the net. As in most situations, when you get rid of the managers pushing M$ products into the workplace, you can eliminate M$ from the computing environment and thus solve 99% of the problems and, as it stands, all of the ransomware problems.

    DefCon's foray with AI in that context will surely come up with something intriguing, but probably ultimately useless if it does not address the core, systemic weaknesses such as the very presence of Windows in a production environment. As such, most countermeasures against malware in hospitals and health care which I've read about in that department or heard from medical professionals are just the computer equivalent of rearranging the deck chairs on the Titanic. The absolute first step in securing systems is having them be open source []; that step is unavoidable. However, the solution is not as simple as slapping Devuan-stable on it and calling it a day. Medical systems, including the packages, have to be even more predictable and static than that. The distro has to be able to provide the exact same versions of files anytime anything is installed from the remote repo — each time, if needed, even if it is a decade or more later.

    Roland Hughes, a seasoned medical device developer, has compiled a rough list of requirements [] for what a GNU/Linux distro, or other Linux distro, would need to be successful in that field of endeavor. Further, it is important that the eventual distro's release cycle follow sound software engineering processes instead of test-driven development or "agile", both of which are known for leaving sloppy wakes of trouble. And even more importantly, it has to have as few moving parts as possible, because every software component, down to the version, has to be managed, evaluated, and verified.

    Money is not free speech. Elections should not be auctions.
    • (Score: 3, Insightful) by Rich on Sunday March 31, @08:26PM

      by Rich (945) on Sunday March 31, @08:26PM (#1351130) Journal

      "AI Fixes To Critical Health-Care Flaws" is a serious US attempt at catching up to yesterday's "EU Leaders Want To Make Europe A Global Quantum Powerhouse" in the bullshit bonanza.

      I read through the Hughes article, because I'm in a somewhat similar project position, but I don't see his points as realistic - nor his demands as required. MATE is my favourite, but it'll be hard to get the world to standardize on that fringe desktop. It'd rather be what the RasPi uses, given its overwhelming presence and community backing in the embedded world. I also concur that QML is utter shite, but Qt is the only sane way of doing C++. NVIDIA drivers, on the other hand, are a no-go for not being source-auditable (but so are most UEFI loaders).

      No one stops anyone from using a solid Debian/Devuan/Slackware base to set up a system that suits the company (maybe even with NVIDIA drivers, the last open Qt version, and exactly that bleeding-edge package they want bugfixed). Make a package list, clone the repo for those packages, switch off updates (*), and you're set. It won't ever change. And if you DO update, you don't hope for upstream doing the right things, but roll the updates out under your full control with your own processes. This hasn't got anything to do with a distro as such; it's pure admin stuff. The target systems will need to be hardened in a specific way that has to be scripted anyway - to a point where development becomes impossible. Therefore, the admin has to cater for different hardening profiles.

      (*) unless you want the user to be able to update in certain emergencies from a non-admin role without having the escalating credentials or requiring a $1000 technician visit on site. Admin 'apt' to taste, nothing distro related.
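The freeze-and-pin workflow described above can be sketched for an apt-based (Debian/Devuan-style) system roughly as follows. This is a sketch under that assumption; the snapshot mirror path and the example package name are illustrative, not a prescription.

```shell
# 1. Record exactly what is installed, with versions, as the frozen manifest.
dpkg-query -W -f='${Package}=${Version}\n' > manifest.txt

# 2. Hold every installed package so routine upgrades become a no-op.
dpkg-query -W -f='${Package}\n' | xargs apt-mark hold

# 3. (Optional) point sources.list at a local snapshot mirror you control,
#    so a reinstall years later yields identical bits, e.g.:
#    deb [trusted=yes] file:/srv/mirror/snapshot-2024-03 stable main

# Releasing a single package deliberately, under your own process:
# apt-mark unhold openssl && apt-get install --only-upgrade openssl
```

The point is the one the comment makes: reproducibility here is an administration discipline (a pinned manifest plus a mirror you own), not a property you wait for a distro to grant you.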

  • (Score: 2, Disagree) by Mojibake Tengu on Sunday March 31, @05:24AM (1 child)

    by Mojibake Tengu (8598) on Sunday March 31, @05:24AM (#1351078) Journal

    Do weapon systems have that unfunny ransomware problem too? Nuclear facilities?

    Had to ask. One absurd aspect of critical software are disclaimers of responsibility by software licenses. That's not acceptable.

    Make it personal. Force people from industry to be responsible, by law, for the critical software they sell. No matter what shitcode they reuse without auditing.

    It took centuries to learn, but now industry standards work rather well in other industries, like machine or building construction.
    Why does software engineering get legal privileges unattainable by other engineering disciplines?
    You can't just proliferate dangerous software to machines which can kill people and disclaim liability.

    For a start, I recommend the death penalty for any software failure which cost more than two human lives. A life term for a single death by software malfunction.
    Additionally, those who malevolently attack medical platforms shall be burned at the stake. There is nothing excusable about that. In a war, it would even be a war crime.

    Call it: "Cleansing for better future."

    Rust programming language offends both my Intelligence and my Spirit.
    • (Score: 4, Insightful) by canopic jug on Sunday March 31, @07:07AM

      by canopic jug (3949) Subscriber Badge on Sunday March 31, @07:07AM (#1351079) Journal

      Had to ask. One absurd aspect of critical software are disclaimers of responsibility by software licenses. That's not acceptable.

      The article is about DEF CON, but ten years ago this year, at a related conference, Black Hat, there was a presentation by Dan Geer about software liability entitled "Cybersecurity as Realpolitik" [] which covered the problems and the changes necessary to make liability happen. Poul-Henning Kamp, of FreeBSD and Varnish fame, has also weighed in on the topic. I've posted both links before, but the gist is that vendors should be on the hook at the same level as other industries, with carve-outs for practices which make software safer, such as publishing under Free Software licenses:

      Today the relevant legal concept is "product liability" and the
      fundamental formula is "If you make money selling something, then
      you better do it well, or you will be held responsible for the
      trouble it causes."  For better or poorer, the only two products
      not covered by product liability today are religion and software,
      and software should not escape for much longer.  Poul-Henning Kamp
      and I have a strawman proposal for how software liability regulation
      could be structured.

      0. Consult criminal code to see if damage caused was due to intent
         or willfulness.

      We are only trying to assign liability for unintentionally caused
      damage, whether that's sloppy coding, insufficient testing, cost
      cutting, incomplete documentation, or just plain incompetence.
      Clause zero moves any kind of intentionally inflicted damage out
      of scope.  That is for your criminal code to deal with, and most
      already do.

      1. If you deliver your software with complete and buildable source
         code and a license that allows disabling any functionality or
         code the licensee decides, your liability is limited to a refund.

      Clause one is how to avoid liability: Make it possible for your
      users to inspect and chop out any and all bits of your software
      they do not trust or want to run.  That includes a bill of materials
      ("Library ABC comes from XYZ") so that trust has some basis,
      paralleling why there are ingredient lists on processed foods.

      The word "disabling" is chosen very carefully:  You do not need to
      give permission to change or modify how the program works, only to
      disable the parts of it that the licensee does not want or trust.
      Liability is limited even if the licensee never actually looks at
      the source code; as long as he has received it, you (as maker) are
      off the hook.  All your other copyrights are still yours to control,
      and your license can contain any language and restriction you care
      for, leaving the situation unchanged with respect to hardware-locking,
      confidentiality, secrets, software piracy, magic numbers, etc.

      Free and Open Source Software (FOSS) is obviously covered by this
      clause which leaves its situation unchanged.

      2. In any other case, you are liable for whatever damage your
         software causes when it is used normally.


      Cybersecurity as Realpolitik [], by Dan Geer, 6 Aug 2014

      (The page that quote is from is very hard to find even if you know it is there.)

      But as for those who malevolently attack medical platforms, how would you define attack? From my perspective, the attack starts with the managers who decide against all best practices to deploy Windows™ in hospital settings. They are setting it up to get knocked down, and their only attempt at defense is to endlessly bleat the assertion that "everyone else is doing it like that" so it must be acceptable [].

      I think there is some movement towards software liability, but FOSS projects, especially those creating libraries and modules, must get out there and fight. You can ignore the politics, but the politics won't ignore you. The software bill of materials (SBOM) is just the start, and it is essential that FOSS have representation at all stages, especially the start, to ensure that FOSS remains a viable option in the future and is not banned or flat-out illegal.

      Money is not free speech. Elections should not be auctions.