posted by martyb on Sunday June 09, @02:30AM

Editor's note: This article has been *greatly* shortened; it is well worth reading the whole article. --Bytram

----------

This AI-powered "black box" could make surgery safer:

While most algorithms operate near perfectly on their own, Peter Grantcharov explains that the OR black box is still not fully autonomous. For example, it's difficult to capture audio through ceiling mikes and thus get a reliable transcript to document whether every element of the surgical safety checklist was completed; he estimates that this algorithm has a 15% error rate. So before the output from each procedure is finalized, one of the Toronto analysts manually verifies adherence to the questionnaire. "It will require a human in the loop," Peter Grantcharov says, but he gauges that the AI model has made the process of confirming checklist compliance 80% to 90% more efficient. He also emphasizes that the models are constantly being improved.

In all, the OR black box can cost about $100,000 to install, and analytics expenses run $25,000 annually, according to Janet Donovan, an OR nurse who shared with MIT Technology Review an estimate given to staff at Brigham and Women's Faulkner Hospital in Massachusetts. (Peter Grantcharov declined to comment on these numbers, writing in an email: "We don't share specific pricing; however, we can say that it's based on the product mix and the total number of rooms, with inherent volume-based discounting built into our pricing models.")

[...] At some level, the identity protections are only half measures. Before 30-day-old recordings are automatically deleted, Grantcharov acknowledges, hospital administrators can still see the OR number, the time of operation, and the patient's medical record number, so even if OR personnel are technically de-identified, they aren't truly anonymous. The result is a sense that "Big Brother is watching," says Christopher Mantyh, vice chair of clinical operations at Duke University Hospital, which has black boxes in seven ORs. He will draw on aggregate data to talk generally about quality improvement at departmental meetings, but when specific issues arise, like breaks in sterility or a cluster of infections, he will look to the recordings and "go to the surgeons directly."

In many ways, that's what worries Donovan, the Faulkner Hospital nurse. She's not convinced the hospital will protect staff members' identities and is worried that these recordings will be used against them—whether through internal disciplinary actions or in a patient's malpractice suit. In February 2023, she and almost 60 others sent a letter to the hospital's chief of surgery objecting to the black box. She's since filed a grievance with the state, with arbitration proceedings scheduled for October.

If you were having an operation, how much of the operation would you want an AI to do?


Original Submission

This discussion was created by martyb (76) for logged-in users only.
  • (Score: 0) by Anonymous Coward on Sunday June 09, @10:06AM (#1359924) (2 children)
    Can my estate receive a multi-million dollar payout from the company responsible for making the AI which killed me? Human doctors get sued for malpractice, anyone with the guts to build such an AI should be prepared to face the same level of heat as a human doing the same job.
    • (Score: 2) by looorg (578) on Sunday June 09, @11:54AM (#1359927)

      The box is a tool as I understand it, like any other piece of surgical equipment. The "human in the loop", as they like to call it, is still the one responsible for approving or performing the procedure. Until the day they hook this thing up to another machine that can somehow perform the surgery itself, it's still the human doctor holding the scalpel, pushing the button, or clicking "ok" on the screen who is responsible, and who gets sued when you die on the operating table.

    • (Score: 1, Informative) by Anonymous Coward on Sunday June 09, @04:40PM (#1359944)

      You didn't actually read TFA, did you? The "black box", which will be in the OR, simply records what happens in the OR. The so-called AI will analyze the procedure, looking for errors and inefficiencies. After surgery, everyone sits down, and listens to the AI's analysis. Well, everyone who thinks that they are anybody, anyway. Administrative staff and the doctors will have access, maybe senior nurses, but the rank and file working people won't have access. If the doctor made a serious mistake, like leaving some bit of equipment inside the patient, the AI is probably going to catch that, then the doctor will catch hell, and admin will scramble to head off the malpractice suit which is coming. The whole thing is meant to help doctors improve - at this point in time.

      Like all other surveillance methods, the black box and AI will become mandatory at all hospitals, and at some point in time, the evidence collected will be used to prosecute doctors who make mistakes. The evidence will also be used to distance the hospital from the doctor's actions. But, doctors aren't supposed to be smart enough to figure any of that out. Right now, they are working to get doctors and other staff to accept the black box as a training tool.

  • (Score: 4, Interesting) by pTamok (3042) on Sunday June 09, @11:58AM (#1359928)

    There are two things being mixed up here:

    1) The presence of a 'black box' in an operating theatre
    2) Analysis of 'black-box' data by software claimed to be intelligent.

    Black boxes have improved air safety by huge amounts, together with air accident investigations that are specifically not about apportioning blame for legal action, but about finding the cause(s) of the accident and hence giving data to those who would change engineering and procedures to reduce the likelihood of similar future accidents. Black boxes can be a good thing, but they need the right 'just culture' around them. Surgery has learned a lot from air safety - including the use of checklists.

    Using computational means to reduce the need for human involvement in analysis (and, maybe, preserve anonymity) is potentially a good thing.

    One thing that operating theatres have is a steep command gradient between the surgeon and other staff. Air safety practice regards this as bad, and encourages flight decks where it is normal for junior people to question the decisions of the commander, and to take over responsibility for the flight if necessary. It is tough to get right, but far too many accidents have happened when a commander made poor decisions that were obvious to the more junior flight-deck personnel, yet culture prevented questioning and correction. Certain flag-carrier airlines were notorious for this.

    AIs probably make good autopilots, but I'd like an experienced pilot available for when the autopilot gives up. The same is true for surgery - AI is likely to handle routine cases better than humans, but be unable to cope with the unexpected. Unfortunately, experienced surgeons need to gain their experience by doing lots of surgery, and handing over to a robot surgeon will lead to skills atrophy. This is an issue for pilots as well. I don't know how to address this, and many clever minds are looking at the problem.

    Improving safety with 'black-boxes' is a good thing. Using automation to do the easy stuff leads to loss of skills able to cope with the unexpected, so I'm undecided on that score.

  • (Score: 2) by krishnoid (1156) on Sunday June 09, @09:14PM (#1359962)

    Checklists decrease infection rates [newyorker.com] significantly. It sure would be nice if they could start by using the AI to provide feedback only to the surgical team for a few years, and then make it more public. Given the choice, I'd rather not add more stress and worries to a surgical team that's going to be cutting me open -- especially if they've already been through the system across multiple surgeries before they get to me.

    I'd prefer it if the AI could help them follow the checklist during the surgery itself. Or better, someone whose sole responsibility is to watch the procedure, check that everything on the checklist is happening and in order, and be able to speak and display into the operating theater if they notice something was skipped, done out of order, or done incorrectly.
