posted by chromas on Tuesday April 24 2018, @01:47PM   Printer-friendly
from the brains-need-an-update-Tuesday dept.

A team of academic security researchers from KU Leuven, Belgium, has discovered that medical implants such as electrical brain implants are insecure devices because of defects in their wireless interfaces.

The researchers found that the security of these devices is weak: the defects in their wireless interfaces can allow attackers to obtain sensitive neurological data, administer shocks, and intercept confidential medical data transmitted between the implant and the connected devices responsible for controlling, updating and reading it.

[...] By hacking neurostimulators, an attacker can cause irreversible damage to patients by preventing them from speaking or moving. Such an attack may even prove life-threatening, wrote the Belgian researchers in the paper that details their findings.

Source: Hackread

The research paper in PDF form. [DOI: 10.1145/3176258.3176310]

From the abstract:

Implantable medical devices (IMDs) typically rely on proprietary protocols to wirelessly communicate with external device programmers. In this paper, we fully reverse engineer the proprietary protocol between a device programmer and a widely used commercial neurostimulator from one of the leading IMD manufacturers. For the reverse engineering, we follow a black-box approach and use inexpensive hardware equipment. We document the message format and the protocol state-machine, and show that the transmissions sent over the air are neither encrypted nor authenticated. Furthermore, we conduct several software radio-based attacks that could compromise the safety and privacy of patients, and investigate the feasibility of performing these attacks in real scenarios.
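
To make the "neither encrypted nor authenticated" finding concrete: without a message authentication code, the implant has no way to tell a legitimate programmer frame from one synthesized by an attacker. The sketch below is purely illustrative; the frame layout, field sizes and key are hypothetical (Python standard library only), not the protocol the paper reverse engineers.

    import hmac, hashlib, struct

    KEY = b"key-shared-by-programmer-and-implant"  # hypothetical pairing secret

    def plain_frame(device_id: int, amplitude_ma: float) -> bytes:
        # Hypothetical layout: 4-byte device id, amplitude in tenths of a mA.
        return struct.pack(">IH", device_id, int(amplitude_ma * 10))

    def authenticated_frame(device_id: int, amplitude_ma: float) -> bytes:
        payload = plain_frame(device_id, amplitude_ma)
        tag = hmac.new(KEY, payload, hashlib.sha256).digest()[:8]
        return payload + tag

    def implant_accepts(frame: bytes) -> bool:
        # With a MAC, a forged or tampered frame fails verification.
        payload, tag = frame[:-8], frame[-8:]
        expected = hmac.new(KEY, payload, hashlib.sha256).digest()[:8]
        return hmac.compare_digest(tag, expected)

    # An attacker who never learned KEY cannot mint an acceptable frame:
    print(implant_accepts(plain_frame(0x1234, 8.0) + bytes(8)))   # False
    print(implant_accepts(authenticated_frame(0x1234, 2.0)))      # True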


Original Submission

 
  • (Score: 4, Insightful) by crafoo on Tuesday April 24 2018, @03:15PM (5 children)

    by crafoo (6639) on Tuesday April 24 2018, @03:15PM (#671184)

    I guess I expected much more care to be taken with wireless signals transmitted to something that directly affects a patient's brain operation. In a world where Hollywood movies are protected by the best encryption available, and publishers are demanding end-to-end encryption of digital signals just to play back their entertainment... brain interface devices are wide open, without even authentication of the signal's origin? What a strange world.

  • (Score: 1, Interesting) by Anonymous Coward on Tuesday April 24 2018, @03:58PM (2 children)

    by Anonymous Coward on Tuesday April 24 2018, @03:58PM (#671201)

    Literally everything in medicine is proprietary and security-by-obscurity. There's a reason your hospital runs on out-of-date Windows - and we're talking XP or 7 here, never later than 7: their suppliers and software providers can't be bothered to make anything up to date or standards-compliant. The joke is really that they claim to care about information security / protected health information, then shoot themselves in both feet right at boot time.

    Of course the devices are no different.

    It is worth noting, however, that for implanted stuff running on super low power batteries, adding encryption is nontrivial and may well tank the battery life. Telling patients the new version lasts half or a third as long as last year's version is not going to go over well.
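
    A back-of-the-envelope way to reason about that trade-off (every number below is an assumed placeholder, not a measurement; the point stands or falls on the real figures for a given implant):

        # Rough energy budget for adding encryption + authentication to each
        # message; all constants are assumptions for illustration only.

        BATTERY_MAH   = 200      # assumed implant battery capacity
        SUPPLY_V      = 3.0      # assumed cell voltage
        NJ_PER_AES    = 150      # assumed energy per 16-byte AES block on the MCU
        NJ_PER_BYTE   = 1500     # assumed radio energy per transmitted byte
        TAG_BYTES     = 8        # authentication tag appended to every message
        MSGS_PER_DAY  = 200      # assumed programming/telemetry traffic

        battery_j = BATTERY_MAH / 1000 * 3600 * SUPPLY_V        # mAh -> joules
        cipher_j  = MSGS_PER_DAY * 2 * NJ_PER_AES / 1e9         # encrypt + MAC
        radio_j   = MSGS_PER_DAY * TAG_BYTES * NJ_PER_BYTE / 1e9
        per_year  = (cipher_j + radio_j) * 365

        print(f"crypto overhead per year: {per_year * 1e3:.1f} mJ")
        print(f"fraction of battery/year: {per_year / battery_j:.2e}")
        # Whether this "tanks" battery life depends entirely on the real radio,
        # duty cycle and key-exchange scheme, which this sketch does not model.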

    • (Score: 2) by HiThere on Tuesday April 24 2018, @05:14PM (1 child)

      by HiThere (866) Subscriber Badge on Tuesday April 24 2018, @05:14PM (#671232) Journal

      It's actually worse than that: they don't just run an out-of-date MSWindows machine, they run one without many of the security patches, because it would cost a lot to get the updated one certified...and the manufacturer is the one who would need to do the certification, and he'd rather sell you a new machine (that also wouldn't get updates). And some of those things are *EXPENSIVE*.

      It's a combination of bureaucracy and perverse incentives. But since any patch might break the drivers, there is no obvious way forwards except keeping them all isolated from the internet. Unfortunately, that would be inconvenient.

      But with medical devices, I think they're made by people who considered IoT to be insanely secure.

      --
      Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
      • (Score: 0) by Anonymous Coward on Wednesday April 25 2018, @04:33AM

        by Anonymous Coward on Wednesday April 25 2018, @04:33AM (#671514)

        But since any patch might break the drivers, there is no obvious way forwards except keeping them all isolated from the internet. Unfortunately, that would be inconvenient.

        And also ineffective. There is always at least one boob with a USB stick around, cf. Stuxnet [wikipedia.org].

  • (Score: 1, Funny) by Anonymous Coward on Tuesday April 24 2018, @04:04PM

    by Anonymous Coward on Tuesday April 24 2018, @04:04PM (#671204)

    The first thing I thought when I saw this was holy shi—

    What was I doing? Why is this comment box open? I'm certain there's nothing to worry about. Only a paranoid lunatic would think it could be used for mind contro— just a moment — just a moment — Oh! You also seem unaware of how important protecting our intellectual property is. Even now, Russian hackers could be giving the Chinese our most advanced summer romcom research. Protecting our romcoms is of utmost importance!

  • (Score: 4, Interesting) by JoeMerchant on Tuesday April 24 2018, @10:49PM

    by JoeMerchant (3937) on Tuesday April 24 2018, @10:49PM (#671391)

    Ha! My first day on the job with a Neurostim company (in the very early 2000s) I was shocked (not literally - that happens at defibrillator companies) to learn that the device comm protocol had an 8-bit, "open" checksum. Not only was the checksum only 8 bits, it was also a simple calculation that an attacker could easily guess just by observing a couple of transactions. So, program the device to 1mA - that's a 4 byte sequence. Program it to 2mA - another 4 byte sequence. Company-approved programmer software (and FDA permission to market for human use) limits current to a maximum of ~2.5mA, but... if you just identify what changed between the 1 and 2 programming sequences, you could easily plug your own RS-232 port into the company programming wand and program 4 or 8 milliamps, which has the potential to do several very bad things, but mostly is just really painful.
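
    (For anyone curious how little observation that attack takes, here is a sketch. The 4-byte frame and additive checksum are guesses at the general shape of such a scheme, not the actual protocol.)

        # Hypothetical 4-byte programming frame: command byte, 16-bit current
        # in tenths of a mA, and a simple additive 8-bit checksum.

        def checksum(payload: bytes) -> int:
            return sum(payload) & 0xFF

        def frame(cmd: int, tenths_ma: int) -> bytes:
            payload = bytes([cmd, tenths_ma >> 8, tenths_ma & 0xFF])
            return payload + bytes([checksum(payload)])

        observed_1ma = frame(0x10, 10)   # sniffed: programmer sets 1.0 mA
        observed_2ma = frame(0x10, 20)   # sniffed: programmer sets 2.0 mA

        # Only the current field and the final byte differ between the two
        # captures, so the attacker infers "last byte = sum of the rest" and
        # can mint a frame for any off-label setting, e.g. 8.0 mA:
        forged = frame(0x10, 80)
        print(forged.hex())   # indistinguishable from a legitimate command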

    Feeling the need to step forward and voice my concerns, I was eventually (on the first day, before lunch) put face to face with the engineer / legal expert witness, who blew the same smoke up my ass about how his mathematical analysis found that it would be so many billions of years before a random error caused an adverse programming event. Yeah, I'm not buying what you're selling, but I'm not the customer.

    So, 18 months later he, I, and the programmer software lead were called into a room to investigate a rash of reports of inadvertent programming error events with associated painful stimulation. Turns out that the programming code for 8mA stim is just like the programming code for a common lower stim value, except that the 8mA code ends in all zeroes (including that 8-bit checksum), so accidentally moving the programming wand away too early led to programming the device to max (and off-label) power.

    A) We worked out a longer programming sequence that used confirmation before activation to prevent this from happening, and a fix was pushed to the field very quickly.
    B) The new device that came out a few years later sported an "upgraded" 16-bit (still non-crypto) checksum, so by now all the old devices have been replaced with marginally better ones.
    C) The programming wand only had an effective range of about 4 inches, so a malicious attacker would have to get pretty close to make anything interesting happen.
    D) Nobody died as a direct result of the programming error events, but you would be shocked (in a different way from the victims) to learn how many times that accident happened in the field with 50,000 users of the device.
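
    The fix described in A) is essentially a two-phase commit: stage the setting, then require a separate confirmation before anything is activated, so a truncated session (wand pulled away early, trailing bytes read as zeroes) leaves the old setting untouched. A rough sketch of that idea, not the vendor's actual implementation:

        # Two-phase "confirm before activate" programming; hypothetical sketch.

        class Neurostim:
            def __init__(self) -> None:
                self.active_ma = 1.0     # currently delivered amplitude
                self.staged_ma = None    # pending, not yet activated

            def stage(self, amplitude_ma: float) -> None:
                # First frame only stages the value; output is unchanged.
                self.staged_ma = amplitude_ma

            def confirm(self, amplitude_ma: float) -> None:
                # Second frame must echo the staged value to take effect, so a
                # truncated or garbled session never activates a new setting.
                if self.staged_ma is not None and self.staged_ma == amplitude_ma:
                    self.active_ma = amplitude_ma
                self.staged_ma = None

        dev = Neurostim()
        dev.stage(2.0)                   # wand pulled away: no confirm frame
        print(dev.active_ma)             # still 1.0
        dev.stage(2.0); dev.confirm(2.0)
        print(dev.active_ma)             # 2.0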

    --
    🌻🌻 [google.com]