
posted by takyon on Tuesday May 16 2017, @10:31PM   Printer-friendly
from the deepmed dept.

Google's use of Brits' medical records to train an AI and treat people was legally "inappropriate," says Dame Fiona Caldicott, the National Data Guardian at the UK's Department of Health.

In April 2016 it was revealed the web giant had signed a deal with the Royal Free Hospital in London to build an artificially intelligent application called Streams, which would analyze patients' records and identify those suffering from acute kidney injury.

As part of the agreement, the hospital handed over 1.6 million sets of NHS medical files to DeepMind, Google's highly secretive machine-learning nerve center. However, not every patient was aware that their data was being given to Google to train the Streams AI model. And the software was supposed to be used only as a trial – an experiment with software-driven diagnosis – yet it was ultimately used to detect kidney injuries in people and alert clinicians that they needed treatment.

Dame Fiona has told the hospital's medical director, Professor Stephen Powis, that he overstepped the mark: it is one thing to create and test an application, and another thing entirely to use in-development code to treat people. Proper safety trials must be carried out for medical systems, she said.

We are going to see many more stories like this.


Original Submission

This discussion has been archived. No new comments can be posted.
  • (Score: 0) by Anonymous Coward on Wednesday May 17 2017, @04:37PM (#511201)

    Can the data be poisoned to suggest daily & mandatory proctology exams for the entire Google and related workforce?