
posted by takyon on Tuesday May 16 2017, @10:31PM
from the deepmed dept.

Google's use of Brits' medical records to train an AI and treat people was legally "inappropriate," says Dame Fiona Caldicott, the National Data Guardian at the UK's Department of Health.

In April 2016 it was revealed the web giant had signed a deal with the Royal Free Hospital in London to build an artificially intelligent application called Streams, which would analyze patients' records and identify those who had acute kidney damage.

As part of the agreement, the hospital handed over 1.6 million sets of NHS medical files to DeepMind, Google's highly secretive machine-learning nerve center. However, not every patient was aware that their data was being given to Google to train the Streams AI model. And the software was supposed to be used only as a trial – an experiment with software-driven diagnosis – yet it was ultimately used to detect kidney injuries in people and alert clinicians that they needed treatment.

Dame Caldicott has told the hospital's medical director Professor Stephen Powis that he overstepped the mark: it's one thing to create and test an application, it's another thing entirely to use in-development code to treat people. Proper safety trials must be carried out for medical systems, she said.

We are going to see many more stories like this.


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by kaszz on Wednesday May 17 2017, @04:04PM (1 child)

    by kaszz (4211) on Wednesday May 17 2017, @04:04PM (#511174) Journal

    If Google or anyone else made money from the improper use of this data, there seems to be a strong case for civil litigation.

    It may be that they didn't make money from this particular event. But unless the knowledge gained, the code produced, etc. is shared with the public, it's the same old story: the people provide the data and the mega-rich reap the profit.. from the people.

    So at a minimum, British citizens now own this AI and can use it for free to make better predictions about who should get an extra checkup, etc.

    Starting Score:    1  point
    Karma-Bonus Modifier   +1  

    Total Score:   2  
  • (Score: 0) by Anonymous Coward on Wednesday May 17 2017, @04:37PM

    by Anonymous Coward on Wednesday May 17 2017, @04:37PM (#511201)

    Can the data be poisoned to suggest daily & mandatory proctology exams for the entire Google and related workforce?