Waiting for the results of a medical test is excruciating. But what if you could instantly diagnose something—like that weird-looking mole you’ve been giving the side-eye—using an app on your smartphone to scan yourself?
That might be a reality in the very near future. With easy access to health care in mind, researchers at Stanford University developed an artificially intelligent diagnosis algorithm for skin cancer. First they made a database of 130,000 images of skin disease. Based on those images, they trained the algorithm to visually diagnose skin cancer.
The program didn’t just work; it worked incredibly well, diagnosing as accurately as real doctors. It was tested against 21 flesh-and-blood, board-certified dermatologists and matched their performance in diagnosing cancerous skin lesions.
The survival rate for skin cancer detected early is about 97 percent, but late detection can drop the chance of survival to 14 percent. Using a computer program to easily detect skin cancer early could be game-changing, the researchers believe. They built the program on an algorithm borrowed from Google that had already been trained on 1.28 million images across 1,000 object categories, then fed it 130,000 images of skin lesions representing more than 2,000 different diseases.
They wanted the algorithm to learn to tell the difference between a cancerous skin lesion and a patch of eczema, for example, just by “looking.” By loading in these images, the researchers were giving the algorithm something to compare future images of skin diseases to, a knowledge base for diagnosis.
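The approach described above is what machine-learning researchers call transfer learning: a network already trained on a large, general image collection is reused, with only a new classification layer trained for the new task. Here is a minimal, purely illustrative sketch of that idea, with a fixed random projection standing in for Google's pretrained network and synthetic data standing in for the lesion photos (every name and number below is an illustrative assumption, not a detail from the Stanford work):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the pretrained network: a frozen feature extractor.
# (In the Stanford work this role was played by a Google network
# trained on 1.28 million images; here it is a fixed random projection.)
W_frozen = rng.normal(size=(64, 16))

def features(x):
    """Frozen 'pretrained' features; only the new head below is trained."""
    return np.tanh(x @ W_frozen)

# Toy two-class 'benign vs. malignant' dataset, purely synthetic.
n = 200
X = rng.normal(size=(n, 64))
true_w = rng.normal(size=16)
y = (features(X) @ true_w > 0).astype(float)

# New classification head, trained by gradient descent on logistic loss
# while the feature extractor stays untouched.
w = np.zeros(16)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(features(X) @ w)))
    grad = features(X).T @ (p - y) / n
    w -= 1.0 * grad

acc = np.mean((features(X) @ w > 0) == (y == 1))
print(round(acc, 2))
```

The design point the sketch captures is why the 130,000-image database was enough: the expensive general visual knowledge is inherited from the pretrained network, and only the comparatively small skin-disease head has to be learned from the new images.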
“There’s no huge dataset of skin cancer that we can just train our algorithms on, so we had to make our own,” said Stanford’s Brett Kuprel, one of the researchers on the project, to the university. “We gathered images from the internet and worked with the medical school to create a nice taxonomy out of data that was very messy – the labels alone were in several languages, including German, Arabic and Latin.”
To test the algorithm against real dermatologists, the researchers had 21 skin doctors look at more than 370 photos of malignant carcinomas and melanomas and say how they’d proceed with treatment for each patient. The dermatologists’ ability to tell a malignant lesion from a benign one matched the algorithm’s.
The team hopes to make the computer program smartphone compatible—but further testing of the algorithm in a clinical setting is needed before the technology is made available to everyone, the researchers said.
“My main eureka moment was when I realized just how ubiquitous smartphones will be,” said Stanford’s Andre Esteva, another researcher on the team, to the university. “Everyone will have a supercomputer in their pockets with a number of sensors in it, including a camera. What if we could use it to visually screen for skin cancer? Or other ailments?”
Katie Moritz is Rewire’s senior editor and a Pisces who enjoys thrift stores, rock concerts and pho. She covered politics for a newspaper in Juneau, Alaska, before driving down to balmy Minnesota to help produce long-standing public affairs show “Almanac” at Twin Cities PBS. Now she works on this here website. Reach her via email at [email protected] Follow her on Twitter @katecmoritz.