AI Ups Precision in Breast Cancer Pathology

Kristin Jenkins

December 12, 2017

In the latest study of physicians vs machines in medicine, artificial intelligence (AI) outperformed pathologists using microscopes to evaluate small amounts of cancer that had spread to the lymph nodes in women with breast cancer, according to a report published online December 12 in JAMA.

Advanced or "deep learning" algorithms are potentially faster and more accurate than pathologists in this setting, conclude investigators led by Babak Ehteshami Bejnordi, Radboud University Medical Center, Nijmegen, the Netherlands.

Thirty-two algorithms were developed during the international Cancer Metastases in Lymph Nodes (CAMELYON16) challenge to create automated solutions for the diagnosis of sentinel lymph node (SLN) metastases in breast cancer, the authors explain.

In cross-sectional analyses, the top seven algorithms significantly outperformed a panel of 11 Dutch pathologists, who reviewed 129 whole-slide images to detect micrometastases, defined as tumor-cell clusters measuring 0.2 mm to less than 2 mm in diameter. Three of the 11 pathologists were specialists in breast pathology.

The pathologists took part in a 2-hour simulation exercise intended to mimic pathology workflow. For the pathologist who performed best, the mean area under the curve (AUC) was 0.810; for the algorithm that performed best, the mean AUC was superior, at 0.994 (P < .001).

For the panel of pathologists, the mean sensitivity for detecting macrometastases was 92.9%, and the mean AUC was 0.964. For detecting micrometastases, the mean sensitivity dipped to 38.3%, and the mean AUC dropped to 0.685.
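For readers less familiar with the metric, the AUC is the probability that a randomly chosen metastasis-containing slide receives a higher score than a randomly chosen metastasis-free slide, aggregated over all decision thresholds. A minimal sketch of that pairwise definition follows; the labels and scores are purely illustrative, not data from the study:

```python
# Hypothetical, minimal AUC computation: the fraction of positive-negative
# slide pairs in which the positive slide is ranked higher (ties count 0.5).
def auc(y_true, y_score):
    pos = [s for t, s in zip(y_true, y_score) if t == 1]
    neg = [s for t, s in zip(y_true, y_score) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Illustrative labels (1 = slide contains metastasis) and classifier scores.
y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_score = [0.9, 0.8, 0.4, 0.7, 0.3, 0.1, 0.2, 0.6]
print(f"AUC = {auc(y_true, y_score):.3f}")  # 15 of 16 pairs ranked correctly
```

An AUC of 1.0 means perfect separation of positive and negative slides, and 0.5 means chance-level ranking, which is why the gap between 0.685 (micrometastases) and 0.964 (macrometastases) is so telling.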

"These findings suggest the potential utility of deep learning algorithms for pathological diagnosis, but require assessment in a clinical setting," the researchers say.

"Even the best performing pathologist in the panel missed 37.1% of the cases with only micro-metastases," they note. They admit that the task's 2-hour time frame "does not represent the work pace in other settings."

When the time constraints were removed and a pathologist was given unlimited time to review and interpret the slides, the results were comparable with those from the top five algorithms (mean AUC, 0.966 vs 0.960). However, it took the pathologist 30 hours to review all the cases, and 27.6% of individual metastases were not identified.

"The top-performing algorithm achieved a lesion-level, true-positive fraction comparable with that of the pathologist without time constraints (72.4%) at a mean of 0.0125 false-positives per normal whole-slide image," the study authors report.

Not Fair?

In an accompanying editorial, Jeffrey Alan Golden, MD, of the Department of Pathology, Brigham and Women's Hospital, Boston, Massachusetts, calls the study findings "exciting" but notes that the period available for the panel to review the slides was "unrealistically short." In routine practice, he says, a pathologist faced with an ambiguous finding would ask for additional sections or special stains.

"While no perfect control exists, it is clear the comparisons made in this study have limitations," points out Dr Golden, who is also Ramzi S. Cotran Professor of Pathology at Harvard Medical School. Whether the algorithms were equally effective at detecting all types of breast cancer is also unclear, he says.

The CAMELYON16 challenge, which ended in November, was organized in collaboration with the Institute of Electrical and Electronics Engineers' International Symposium on Biomedical Imaging.

To create algorithms, 390 contestants were given 270 hematoxylin and eosin–stained whole-slide images and corresponding glass slides of sentinel lymph nodes. The images were collected in 2015 from breast cancer patients at the Radboud University Medical Center and the University Medical Center Utrecht. Of a total of 270 whole-slide images, 110 contained nodal metastases, and 160 did not.

An independent test set of 129 deidentified whole-slide images was created for the competition. Forty-nine slide images had nodal metastases, and 80 did not.

Nearly 80% of the contestants used an advanced convolutional neural network to develop the deep learning algorithms. This method identifies metastases by evaluating the data from whole-slide images on a pixel-by-pixel basis. Such algorithms perform better than those based on manually engineered features, the researchers say.
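In broad strokes, such systems tile a gigapixel whole-slide image into small patches, score each patch with a trained network, and assemble the scores into a metastasis-probability heatmap. The sketch below shows only that tiling-and-heatmap scaffolding; the `score_fn` stand-in (mean pixel intensity) is an assumption for illustration and is not the deep convolutional classifiers used in CAMELYON16:

```python
# Sketch of patch-based whole-slide inference: tile the slide, score each
# tile, and collect the scores into a heatmap. In the competition entries,
# score_fn would be a trained convolutional neural network, not this stub.
import numpy as np

def heatmap(slide: np.ndarray, patch: int, score_fn) -> np.ndarray:
    """Score non-overlapping patch x patch tiles of a 2-D `slide` array."""
    rows, cols = slide.shape[0] // patch, slide.shape[1] // patch
    out = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            tile = slide[r*patch:(r+1)*patch, c*patch:(c+1)*patch]
            out[r, c] = score_fn(tile)
    return out

# Toy 8x8 "slide" with a bright 4x4 region standing in for tumor tissue.
slide = np.zeros((8, 8))
slide[0:4, 4:8] = 1.0
hm = heatmap(slide, patch=4, score_fn=np.mean)
print(hm)  # only the top-right tile scores 1.0
```

Downstream, the heatmap is what lets an algorithm report both a slide-level verdict and the location of each suspected lesion, which is the basis of the lesion-level true-positive fraction quoted above.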

"It has been shown that deep learning algorithms could identify metastases in SLN slides with 100% sensitivity, whereas 40% of the slides without metastases could be identified as such," the study authors explain. "This could result in a significant reduction in the workload of pathologists."

Radiology and pathology are image-rich specialties, and thus are ripe for infiltration by AI, notes editorialist Dr Golden. "Essentially everything we do has images," he points out in a video clip.


However, unlike radiologists, who have been collecting images digitally for more than 25 years, pathologists are relatively new to AI. There are challenges to overcome, Dr Golden says. For one, proof that AI will add value is needed, because it will most certainly add cost.

Education will also be a challenge, he predicts: "We are going to have to train a whole generation of pathologists to become comfortable adopting computers and AI to assist them in making their workflow more efficient."

To accept AI, pathologists the world over will have to give up their microscopes, says Dr Golden. For many, he adds, "that will be a pretty scary place to go."

Rather than feeling threatened by AI, clinicians should embrace it, lead author Bejnordi told Medscape Medical News.

"The era in which AI-based diagnostic tools perform as well as or better than humans in specific tasks has already started," he said. "This offers a great opportunity to empower clinicians by improving their efficiency and accuracy."

Bejnordi and Dr Golden agree that AI won't replace clinicians, but it will likely change the way they work.

"Pathologists may, for example, spend less time on the interpretation of pathology slides but focus more on critically important tasks such as aggregating data from multiple sources to understand patterns that allow for more accurate and definitive, personalized diagnoses," said Bejnordi.

Deep learning can help pathologists improve the efficiency of their work, provide better prognostication, and standardize quality, said Dr Golden. "AI may be just what pathology has been waiting for. The emergence of AI in healthcare, the reduced costs of digital data, and the availability of usable digital images are now in alignment for digital pathology to succeed."

AI won't take the place of human expertise, Dr Golden emphasized.

"Like electron microscopy, immunohistochemistry, and molecular diagnostics ahead of AI, there is little risk of pathologists being replaced," he said. "Although their workflow is likely to change, the contributions of pathologists to patient care will continue to be critically important."

At Radboud University Medical Center, a pathologist is now routinely using the results of a deep learning algorithm developed by Bejnordi and colleagues as part of the developers' ongoing investigation into how AI can improve diagnostic objectivity and efficiency.

The CAMELYON16 challenge was a 2-year project initiated by the computational pathology group at Radboud University. The second such challenge, CAMELYON17, has begun.

The new challenge moves the battle between pathologists and AI away from slide-level diagnosis "to direct assessment of patient outcomes, making it more relevant to the clinical setting," explained Bejnordi.

"This challenge," he added, "is still open to research groups worldwide."

The work was supported by the Seventh Framework Programme for Research–funded VPH-PRISM project of the European Union, Stichting IT Projecten, and the Fonds Economische Structuurversterking. Babak Bejnordi has disclosed no relevant financial relationships. Several coauthors of the article have disclosed relationships with the pharmaceutical and biomedical imaging industry. Dr Golden has disclosed no relevant financial relationships.

JAMA. Published online December 12, 2017. Abstract, Editorial
