Technology to Target Skin Cancer Mortality

Jean-Christophe Lapiere, MD


October 03, 2017

Detecting Skin Cancer in the "Dermatology Desert"

The September issue of the magazine Women's Health included a special investigation[1] unveiling what it called "dermatology deserts": regions of the country where it's impossible for women to see a dermatologist in a timely manner—even if they suspect that they have skin cancer or, in some cases, even after they've been diagnosed with melanoma and need to have it removed. The report builds on a 2017 study published in the journal JAMA Dermatology,[2] which examined the geographic distribution and density of dermatologists in the United States.

At a time when skin cancer rates are steadily rising, dermatologists have become a rare breed, with fewer than 11,000 across the United States. This ever-increasing supply-demand mismatch is exacerbating what we're already experiencing: patients' lack of access to appropriate dermatologic care, and inefficiencies within our own practices because we aren't triaging patients as effectively as we should.

I try to educate my patients and teach them how to track their moles and skin lesions, but it's a difficult lesson to make stick. The American Academy of Dermatology recommends monthly skin self-exams for patients. In reality, most patients don't perform these self-exams, don't know how to do them properly, and can't remember what they saw a month earlier. A month is long enough that the vast majority cannot sustain the visual memory necessary to track whether something has changed or something new has appeared on their skin.

Where does this leave us? In a perfect world, dermatologists with trained eyes would be checking every patient's skin every month. But we live in the real world, where the number of specialists renders that scenario a sheer impossibility. We must leverage technology to help us amplify access to care, but we must use it in a meaningful way.

Technology to the Rescue?

With its breakneck pace of advancement in recent years, technology has been hailed as the great equalizer. However, it shouldn't be misrepresented as a silver bullet that turns patients into doctors, or used simply for the sake of using it. Technology is extremely good at processing large amounts of data and producing results or even predictions; it is not as well suited (in its current state, particularly in healthcare) to synthesizing data and making empathetic judgments, which is ultimately what we, as doctors, do. That presents an opportunity—and, frankly, a necessity—for technology to play a role both in amplifying access for more patients and in identifying a targeted pipeline of patients who really need our particular expertise and judgment.

Scores of skin cancer-screening apps are marketed directly to consumers, conducting one-off diagnoses of individual lesions to estimate a user's risk for skin cancer or other skin conditions. Many of these apps are not very accurate, but that's not the point. Even if these apps were 100% accurate, their utility and efficacy are ultimately handicapped by what the user chooses to target and analyze. I've seen too many well-intentioned and concerned patients rush into my office to have me look at what they think are suspicious, cancerous lesions. Most are not.

We can't just put an algorithm or app in the hands of untrained patients and expect it to help us detect skin cancer early. It's not realistic to ask patients to track every mole and freckle on their bodies, the lesions that they judge to be suspicious, or those they can physically see (forget about the back!). Technology is part of the answer to help amplify our trained eyes, but it needs to be harnessed in a way that takes the onus of deciding what to analyze out of the hands and minds of patients.

Whole-Body Surveillance

The only realistic way to do that is by leveraging technology that is: (1) accessible; (2) total-body; and (3) specifically designed to track new or changing lesions. If technology can help us track changes in skin (something our current standard of care is ill-suited to accommodate), and the data can be sourced through accessible, everyday devices like smartphones (instead of expensive proprietary imaging systems), we can start to build a meaningful pipeline of patients who see us because they need to be seen, optimizing access to our specific medical expertise.

Smartphone and tablet cameras now offer high enough resolution that field photography of entire body regions can be captured, and those images can be mole-mapped with sufficient sensitivity to track longitudinal changes—new or changing lesions (Figure).

Figure. Two photos of a patient's skin, taken roughly a month apart. Detected spots are marked in blue, and the red circle on the right frame indicates a newly detected spot. Courtesy of SkinIO.
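To make the idea of longitudinal mole-mapping concrete for technically minded readers: once spots have been detected in two registered (aligned) photos of the same body region, flagging a "new" lesion reduces to finding follow-up spots with no nearby counterpart in the baseline image. The sketch below is purely illustrative—it is not SkinIO's algorithm, and it assumes spot detection and image registration have already been done, with centroids expressed in shared pixel coordinates and a hypothetical matching tolerance.

```python
import math

def match_new_spots(baseline, followup, tol=15.0):
    """Flag follow-up spots with no baseline counterpart within `tol` pixels.

    baseline, followup: lists of (x, y) spot centroids detected in
    registered photos of the same body region taken at two visits.
    tol: illustrative matching tolerance (pixels), not a clinical value.
    """
    new = []
    for fx, fy in followup:
        # A spot is "new" if no baseline spot lies within the tolerance.
        if not any(math.hypot(fx - bx, fy - by) <= tol
                   for bx, by in baseline):
            new.append((fx, fy))
    return new

# Toy example: three stable spots plus one that appears at the follow-up visit.
baseline = [(120, 340), (410, 95), (230, 510)]
followup = [(118, 342), (412, 96), (231, 508), (305, 420)]
print(match_new_spots(baseline, followup))  # [(305, 420)]
```

In practice, a real system must also handle changed (not just new) lesions, imperfect image alignment, and detection noise—which is exactly why this kind of tracking belongs in purpose-built software rather than in a patient's memory.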

The ability to generate visual timelines of a patient's skin would go a long way toward providing more data to us as we see patients who have reason to be seen and, at the same time, more effectively educating patients about what matters on their skin. Because a dermatologist is not needed to take these photos, technology can open up access to otherwise underserved populations through community clinics, retirement centers, and even retail clinics.

We need patients to check their skin every month to look for changes, but we can't just take their word for it. With so many patients and so few of us, it makes no sense not to take advantage of technology.
