
posted by Fnord666 on Saturday January 28 2017, @03:19PM   Printer-friendly
from the Watson-come-here dept.

Machine learning can compete with dermatologists when it comes to diagnosing skin cancer, to an extent:

A group of Stanford researchers has trained one of Google's deep neural networks on a massive database of images that show skin lesions. By the end, the neural network was competitive with dermatologists when it came to diagnosing cancers using images. While the tests done in this paper don't fully represent the challenges a specialist would face, it's still an impressive improvement in computer performance.

[...] For the medical images, the authors relied on Stanford's extensive records focusing on skin diseases. In all, they arranged more than 2,000 individual disorders into a tree-like structure based on their relatedness. So, for example, all inflammatory problems ended up on one branch of the tree and all the cancers on another. These were further subdivided until the branching reached individual diseases. Google's Inception network was then given the tree and a set of nearly 130,000 images of these disorders and was trained to properly identify each. That's more than 100 times as many images as were used for training in the largest previous study of this sort.
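
For readers curious what that kind of training setup looks like in practice, here is a minimal, purely illustrative sketch of fine-tuning a pretrained Inception v3 on a folder of labeled lesion images. The directory layout, class count, and hyperparameters are assumptions for the example, not the authors' actual pipeline.

```python
# Minimal transfer-learning sketch: fine-tune a pretrained Inception v3 on a
# hypothetical folder of labeled skin-lesion images. Illustrative only; the
# paths, class count, and hyperparameters are assumptions, not the study's.
import tensorflow as tf

IMG_SIZE = (299, 299)   # Inception v3's native input resolution
NUM_CLASSES = 9         # placeholder; the study's taxonomy covers over 2,000 disorders

# Hypothetical dataset directory with one sub-folder per disease class
train_ds = tf.keras.utils.image_dataset_from_directory(
    "skin_lesions/train", image_size=IMG_SIZE, batch_size=32)

# Load Inception v3 pretrained on ImageNet, dropping its original classifier head
base = tf.keras.applications.InceptionV3(
    include_top=False, weights="imagenet",
    input_shape=IMG_SIZE + (3,), pooling="avg")

# New head: scale pixels the way Inception expects, then classify the lesions
inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
x = tf.keras.applications.inception_v3.preprocess_input(inputs)
x = base(x)
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)   # real training needs far more data and tuning
```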

The authors then tested the basic classification system against two dermatologists, using a new set of images where the diagnosis had been confirmed by biopsy. At the most basic level of classification (benign, malignant, or a type called "non-neoplastic"), the neural network's accuracy was over 70 percent, while the doctors' accuracy was in the 60s. When asked for a more detailed classification among nine categories, the neural network had an accuracy of about 55 percent, similar to the numbers put up by the dermatologists.
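
As a rough illustration of how a fine-grained prediction gets scored at a coarser level of such a taxonomy, here is a small sketch. The label map and example cases are made-up placeholders, not the study's categories.

```python
# Illustrative sketch: score predictions at a coarse level of a disease taxonomy,
# in the spirit of the benign / malignant / non-neoplastic evaluation above.
# The label map and example cases are made-up placeholders.
COARSE_CLASS = {
    "melanoma": "malignant",
    "basal cell carcinoma": "malignant",
    "seborrheic keratosis": "benign",
    "dermatofibroma": "benign",
    "psoriasis": "non-neoplastic",
}

def coarse_accuracy(predicted, actual):
    """Fraction of cases where prediction and truth fall in the same coarse class."""
    hits = sum(COARSE_CLASS[p] == COARSE_CLASS[a] for p, a in zip(predicted, actual))
    return hits / len(actual)

preds = ["melanoma", "melanoma", "psoriasis"]
truth = ["basal cell carcinoma", "seborrheic keratosis", "psoriasis"]
print(coarse_accuracy(preds, truth))   # 2/3: two of three land on the right branch
```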

For a further test, the team put Inception up against 21 dermatologists, asking them to determine whether an image contained a benign or malignant lesion. Here, the neural network edged out most of the individual doctors and consistently beat their average performance.
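
Comparisons like this are typically made by plotting each doctor's sensitivity and specificity as a single operating point against the classifier's ROC curve. Below is a minimal sketch of that kind of comparison; the numbers are made-up placeholders, not the study's data.

```python
# Illustrative sketch: compare a binary malignant-vs-benign classifier against a
# clinician's single operating point. All numbers are made-up placeholders.
import numpy as np
from sklearn.metrics import roc_curve, auc

y_true = np.array([0, 1, 1, 0, 1, 0, 1, 1])        # 1 = biopsy-confirmed malignant
y_score = np.array([0.1, 0.8, 0.65, 0.3, 0.9, 0.2, 0.55, 0.7])  # model probabilities

fpr, tpr, _ = roc_curve(y_true, y_score)
print("Model AUC:", auc(fpr, tpr))

# A dermatologist contributes one (sensitivity, specificity) point; the model
# "edges out" that doctor if its curve passes above the point.
derm_sensitivity, derm_specificity = 0.85, 0.70    # hypothetical reviewer
model_sens_at_same_fpr = np.interp(1 - derm_specificity, fpr, tpr)
print("Model sensitivity at the same false-positive rate:", model_sens_at_same_fpr)
```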

Before you conclude that doctors are obsolete, however, remember that neither they nor the algorithm did especially well when simply handed an image of a random skin disease and asked to identify it, rather than being asked for a yes-or-no malignancy diagnosis. And in actual clinical practice, the doctors have considerable advantages: they can examine a lesion from multiple angles, feel it and the surrounding tissue to get a sense of its texture and density, order additional tests, and weigh their own uncertainty. Unlike Inception, they're not limited to looking at images.

Dermatologist-level classification of skin cancer with deep neural networks (DOI: 10.1038/nature21056) (DX)


Original Submission

 
  • (Score: 3, Informative) by aristarchus on Saturday January 28 2017, @06:43PM

    by aristarchus (2645) on Saturday January 28 2017, @06:43PM (#459928) Journal

    We're expected to believe that some layman, armed with hearsay gossip spread by other laymen, knows what was removed from some unidentified, anonymous coward's acquaintance?

    Better than listening to some Fox-infected, not-too-BrightBart, Beacon-reading poster with a pseudonym on a news aggregation site. So it could have been worse.
