
We’ve all probably done it at some point: snapping a photo of a rash or a mole and sending it to a friend or colleague for advice on the “next steps”. But often the “next steps” may include confusion, uncertainty, or just plain fear.

So imagine if AI could guide you to a realistic set of possible diagnoses (as opposed to just guessing) for what the skin lesion may actually be.

Google has been keenly aware of the need for teledermatology to enhance our lives. With its data indicating that more than ten billion searches related to skin, hair, and nail conditions are made every year, the natural progression was to take advantage of the high-resolution camera in your smartphone and harness the power of AI to search for a skin condition that can be difficult to describe in words alone.

Well, this past Tuesday at Google’s annual developer conference, the tech giant revealed the fruits of three years of labor toward its goal of harnessing the power of AI loaded into your smartphone to help identify common skin conditions. The new app, Derm Assist, in many ways represents a good start, armed with a CE mark as a Class I Medical Device in the E.U. and plans for an ongoing pilot study and launch there later this year.

But I qualify this by saying "start," because it's far from a finished product that's ready for prime time as a standalone diagnostic tool. The takeaway, and the most important point to remember: you still need to see your healthcare provider, or a dermatologist, for confirmation.

Here’s how it works: after you launch the app, you use your phone’s camera to capture three images of the skin condition of concern from different vantage points. The app then asks you some questions about your skin type, how long you’ve had the condition, and any other symptoms that may help identify it more accurately. It then analyzes this information and searches its database of 288 skin conditions to create a list of possible matches that you can research further.
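For readers who like to see the moving parts, here is a rough sketch in Python of the flow described above: three photos plus a short questionnaire go in, and a ranked list of possible conditions (not a diagnosis) comes out. To be clear, this is not Google's implementation; the class names, the handful of sample condition labels, and the dummy scoring are purely hypothetical stand-ins for an unpublished system.

```python
# Hypothetical sketch of the app's described workflow. NOT Google's code:
# the model, labels, and scores below are illustrative placeholders only.
from dataclasses import dataclass

@dataclass
class CaseInput:
    image_paths: list[str]      # three photos taken from different angles
    skin_type: str              # e.g., Fitzpatrick I-VI
    duration_weeks: int         # how long the condition has been present
    other_symptoms: list[str]   # e.g., itching, pain

# Stand-in labels; the real catalog reportedly covers 288 conditions.
CONDITION_LABELS = ["eczema", "psoriasis", "tinea", "acne", "melanocytic nevus"]

def score_conditions(case: CaseInput) -> dict[str, float]:
    # Dummy scores for illustration; a real model would analyze the images
    # and the questionnaire answers together.
    return {label: 1.0 / (i + 1) for i, label in enumerate(CONDITION_LABELS)}

def possible_matches(case: CaseInput, top_k: int = 3) -> list[str]:
    scores = score_conditions(case)
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:top_k]   # a list to research further, not a diagnosis

if __name__ == "__main__":
    case = CaseInput(["a.jpg", "b.jpg", "c.jpg"], "III", 2, ["itching"])
    print(possible_matches(case))
```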

And in keeping with its goals, Google is quick to assert that what the app provides is not a final diagnosis nor a definitive answer by any means, but a realistic set of possibilities drawn from its database of 288 skin conditions. The thing is, while 288 conditions may sound like a lot, one might hope the number will grow in the near future, and it likely will, given that far more than 288 skin conditions exist.

“The tool is not intended to provide a diagnosis nor be a substitute for medical advice as many conditions require clinician review, in-person examination, or additional testing like a biopsy,” Google said in its press release. “Rather we hope it gives you access to authoritative information so you can make a more informed decision about your next step.”

The app actually emerged from research published last May in the journal Nature Medicine, which showed that a deep learning system (DLS) could identify common skin conditions with accuracy comparable to that of U.S. board-certified dermatologists.

And a more recent study in JAMA also provided evidence of how non-specialist clinicians (primary care physicians and nurse practitioners) can use AI-based technology to improve their ability to identify and differentiate various skin conditions. In the study, 40 non-specialist healthcare providers interpreted de-identified images of patients’ skin conditions from a telemedicine dermatology service, identified the condition, and made recommendations such as biopsy or referral to a dermatologist. Each clinician examined over 1,000 patients, using the AI tool for half of them and not for the other half.

What the researchers found was that the healthcare providers using the AI-powered tool were significantly more likely to arrive at the same diagnosis as dermatologists, compared with those examining cases without AI assistance. The chances of identifying the correct top condition increased by more than 20% on a relative basis, though the degree of improvement varied by provider. The takeaway was that AI allowed providers to gain confidence in their clinical decisions without increasing their likelihood of recommending biopsies or referrals to dermatologists as the next step. Of note, the likelihood of recommending biopsies or referrals to a dermatologist actually decreased slightly in both categories in this small study.
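To be concrete about what "more than 20% on a relative basis" does and does not mean, here is a toy calculation with made-up numbers (the study's actual agreement rates are not quoted in this piece), showing how a relative gain translates into a smaller absolute one:

```python
# Hypothetical illustration of relative vs. absolute improvement.
baseline = 0.48          # made-up rate: clinician matches the dermatologist's top diagnosis 48% of the time
relative_gain = 0.20     # the article's "more than 20% on a relative basis"
with_ai = baseline * (1 + relative_gain)
print(f"{baseline:.0%} -> {with_ai:.0%}, an absolute gain of {with_ai - baseline:.0%}")
# prints: 48% -> 58%, an absolute gain of 10%
```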

Google has attempted to be inclusive of all skin types in its database, but making that a reality may be a bigger challenge. Google explains that its model “accounts for factors like age, sex, race and skin types — from pale skin that does not tan to brown skin that rarely burns,” and says it has “developed and fine-tuned our model with de-identified data encompassing around 65,000 images and cases data of diagnosed skin conditions, millions of curated skin concern images and thousands of examples of healthy skin — all across different demographics.”

According to reporting from Vice, “the researchers used a training dataset of 64,837 images of 12,399 patients located in two states.” The issue, however, was that out “of the thousands of skin conditions pictured, only 3.5% came from patients with Fitzpatrick skin types V and VI—those representing brown skin and dark brown or black skin, respectively.” Ninety percent of the database was made up of people with fair skin, darker white skin, or light brown skin, based on the results of the study. One argument is that, as a result of “biased sampling” from its available database of images, the app could end up either over- or under-diagnosing people who are non-white.

The enthusiasm for development of such an app is certainly welcome in light of data indicating that over 2 billion people worldwide suffer from dermatologic conditions, with a shortage of specialists to keep up with demand. However, empowering patients with such technology is only helpful if it is accurate and helps lead them to the correct diagnosis.

“It’s good to see the roll out of Derm Assist, but we’ll need to track the results to find out if it is providing net benefit,” said Eric Topol, MD, Founder and Director of the Scripps Research Translational Institute, Professor of Molecular Medicine, Executive Vice President of Scripps Research, and Editor-in-Chief at Medscape. “I suspect it will and we need AI for empowering self-diagnostics, at least screening measures, but we also need validation that it’s effective and not resulting in adverse outcomes such as missing an important diagnosis or inducing unnecessary biopsies.”

Roxana Daneshjou, MD, PhD, a dermatologist at Stanford, cautioned in a tweet that while she is a “huge proponent of AI and...enjoy watching the field advance, I do think the Google team’s work has led to advances. But I haven’t seen data that makes me feel comfortable with putting this in the hands of patients or physicians.” As Topol commented, a clinical trial would be helpful. Daneshjou explains that “first, we have no prospective clinical trial in a clinical setting. The one ‘trial’ they reference uses the test set images from the Nature Medicine paper—we know the algorithm will perform well, but we aren’t seeing performance on a new external dataset.”


Daneshjou further raises the issue in another tweet that “from all the PUBLICLY published data that we have, this algorithm has a HUGE lack of skin color (Fitz V and VI) in the test set of images used,” pointing out that there is “0% Fitzpatrick skin type VI in the test set of the original paper. That’s the darkest skin tone.”

But a larger and looming issue with the dataset that Google based its algorithm on relates to the labeling of the images. Daneshjou explains in a separate tweet that “the images in the original paper were largely labeled by consensus of dermatologists,” offering that “MOST cutaneous malignancies were not labeled based on biopsy results,” and adding that “only 6 of the melanomas in the test set were biopsy-proven.” She further qualified that “the number needed to biopsy for a dermatologist to find a melanoma is about 7.5, so there are likely many training and testing images labeled as melanoma...that are not melanoma.”

But it is also important to realize that there is no FDA approval for this app, and no indication from Google of intent to apply for approval at this time. As for those who, in Daneshjou’s words, “think the FDA will protect us,” she refers to her recent paper published in Nature Medicine, which exposes the “limitations of the evaluation process that can mask vulnerabilities of devices when they are deployed on patients.”

Yet, while the app may be a way to begin the evaluation of a skin lesion, the bottom line for many practicing dermatologists is that an evaluation in an office-based setting remains the most definitive option for patients and providers alike.

“Google’s new AI app can be a helpful starting point for identifying common skin conditions,” said Michele S. Green, MD, a dermatologist in private practice on the Upper East Side of Manhattan, and affiliated with Lenox Hill Hospital. “For patients who may be using the search bar as their first resource, this tool can provide some useful information about dermatologic issues for a variety of skin tones and types.”

“That being said, there is no substitute for having an in-person evaluation done by a board-certified expert in dermatology who can best diagnose and treat your specific skin concerns,” she added. 

But there are also other caveats regarding the value of an in-person evaluation that people may not even realize.

“When patients come in for concerns about a spot on their skin or even a rash, I often find that the spot they are concerned about is benign but other lesions are of concern, even precancer or skin cancer, that they do not notice and would never otherwise point out to me,” said Dr. Doris Day, a dermatologist on the Upper East Side of Manhattan affiliated with NYU Langone Health. “These would be missed, diagnoses would be delayed and patients would suffer in these cases.”

“When I examine a lesion, I use special lighting and I also often use a tool called a dermatoscope to evaluate the lesion, often from different angles to get the information I need for proper diagnosis—when images are sent, there is no way to get that tactile information,” she added.

“My experience with apps has been that it is important to err on the side of caution, which can lead to recommending biopsy or in-office evaluation more than may be truly needed. The time waiting for those appointments can be very anxiety-provoking for patients,” Day qualified.

“It may also lead people to think they have easy access to information which may lead them to over-use the technology and that can be anxiety provoking as well. They will depend on the device to give them information, this can end up leading to more doctors visits and biopsies than needed. Doctors may feel the need to biopsy something the AI said was of concern, even if their own judgment tells them otherwise, as a form of defensive medicine,” Day further explained.

“Finally, there are many times when my patients come in for evaluation of one thing and we end up discussing other unrelated concerns that end up being even more important for their overall health and well-being. There is a critical aspect to the doctor-patient relationship that is sacred and irreplaceable,” she concluded.
