There’s been a lot of hand-wringing about artificial intelligence and robots taking away jobs—by one recent estimate, AI could replace up to six percent of jobs in the U.S. by 2021. While most of those will be in customer service and transportation, a recent study suggests that at least one job requiring highly skilled labor could also be getting some help from AI: dermatologist.
Susan Scutti at CNN reports that researchers at Stanford used a deep learning algorithm developed by Google to diagnose skin cancer. The team taught the algorithm to sort images and recognize patterns by feeding it images of everyday objects over the course of a week. “We taught it with cats and dogs and tables and chairs and all sorts of normal everyday objects,” Andre Esteva, lead author on the article published this week in the journal Nature, tells Scutti. “We used a massive data set of well over a million images.”
The researchers then fed the neural network 129,450 images representing more than 2,000 skin diseases, gathered from 18 doctor-curated online galleries and from the Stanford University Medical Center.
Nicola Davis at The Guardian reports that once the neural network had boned up on skin diseases, the team presented it with 2,000 more images of skin problems that the network had not encountered before and whose diagnoses were confirmed by biopsy and by a panel of 21 dermatologists. The neural network did just as well as, and sometimes better than, board-certified dermatologists at diagnosing disease from the images. When it came to melanomas, the neural network correctly classified 96 percent of malignant growths and 90 percent of benign lesions, while human experts identified 95 percent of the malignancies and 76 percent of the benign lesions.
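Those melanoma figures correspond to the two standard screening metrics: sensitivity (the fraction of malignant lesions correctly flagged) and specificity (the fraction of benign lesions correctly cleared). A minimal sketch of how such percentages fall out of a binary confusion matrix, with illustrative counts that are not from the study itself:

```python
# Sensitivity and specificity from a binary confusion matrix.
# The counts below are illustrative only, not the study's data.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Fraction of malignant lesions correctly flagged as malignant."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Fraction of benign lesions correctly cleared as benign."""
    return true_neg / (true_neg + false_pos)

# Example: of 100 malignant lesions, 96 are flagged;
# of 100 benign lesions, 90 are cleared.
print(f"sensitivity: {sensitivity(96, 4):.0%}")   # 96%
print(f"specificity: {specificity(90, 10):.0%}")  # 90%
```

The trade-off between the two is why the dermatologists' lower 76 percent figure matters: clearing fewer benign lesions means more unnecessary biopsies, even when malignancy detection is comparable.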
Esteva tells Davis that the point of the work is not to replace doctors, but to help streamline the process of screening moles and lesions, which can take up a lot of time. “The aim is absolutely not to replace doctors nor to replace diagnosis,” he says. “What we are replicating [is] sort of the first two initial screenings that a dermatologist might perform.”
In fact, Scutti reports that the research may lead to a phone app that would let users check abnormalities on their skin. That could also help bring dermatology services to areas of the world with limited access to health care and specialists. “Our objective is to bring the expertise of top-level dermatologists to places where the dermatologist is not available,” says Sebastian Thrun, founder of the Google X research lab and senior author of the study.
“My main eureka moment was when I realized just how ubiquitous smartphones will be,” Esteva says in a press release. “Everyone will have a supercomputer in their pockets with a number of sensors in it, including a camera. What if we could use it to visually screen for skin cancer? Or other ailments?”
But there are still some hurdles to overcome. Computational biologist Evelina Gabasova at the University of Cambridge tells Matt Burgess at Wired UK that the neural network may be good at recognizing high-quality images, but that’s different from someone taking a snap of their rear in bad light using a cell phone. “The caveat is that, at the moment, [the software] is trained on clinical images, which may have different lighting but still have similar quality,” she says.
Dr. Anjali Mahto, a spokesperson for the British Skin Foundation, tells Davis that the research is exciting but that she has some lingering concerns. Mahto points out that patients often aren’t aware of skin cancer, and that doctors often find lesions during full-body exams that patients had not noticed.