For years I've heard the promise of artificial intelligence (AI) being touted at radiology conferences and throughout our specialty. We've been waiting for AI to radically improve radiology, but where's the revolution?
The issue lies in our inability to see AI's true applicability. The focus has been on finding the spots where algorithms fit best. This narrow view of AI's utility, without true workflow integration, is akin to a hammer in search of a nail.
Beyond searching for nails, we could be applying the AI hammer to smash the rocks that sit in a radiologist's path. After all, hammers can not only hit nails but also smash rocks. Using Thor's Mjölnir as an analogy for AI, I think of AI's power as being at the will of its wielder: the radiologist. If we focus on larger, more systemic problems, we can break down obstacles that impede daily workflow and productivity, and in doing so help ease burnout.
Using the AI Hammer to Smash Rocks That Hinder Progress
Although AI is already useful in other areas of our lives, applications in radiology have not caught up. For example, current radiology viewing software is unable to suggest relevant prior imaging exams based on the pixels in the current imaging study. Fortunately, there are two strategies that can help radiologists wield the AI hammer and overcome current challenges that hinder progress in clinical practice.
Activity Pattern Identification
AI can use a radiologist's habits to crush a productivity rock. If we passively record each action that a radiologist performs, we can use AI to look for clusters of actions that they complete repeatedly and that can be automated.
For example, there are radiologists who repeatedly read aloud numbers on the screen to enter various data points into radiology reports (eg, CT radiation doses and DEXA T-scores). Used properly, AI would notice that for every CT, the radiologist viewed the image of the dose report and dictated the dose length product (DLP) from that image into the dose field of the radiology report. And for every DEXA, the radiologist viewed the images of the lumbar spine and left hip and dictated the T-scores for those body parts into the corresponding fields in the radiology report. Once noticed, these tasks could be automated.
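To make this concrete, the kind of repetition AI would look for can be sketched as a simple frequency count over a passively recorded action log. Everything below is illustrative: the action names and thresholds are hypothetical, and a real system would mine far richer event streams.

```python
from collections import Counter

def frequent_action_sequences(action_log, length=3, min_count=5):
    """Slide a fixed-length window over a recorded list of workstation
    actions and return the windows repeated often enough to be
    candidates for automation."""
    windows = [
        tuple(action_log[i:i + length])
        for i in range(len(action_log) - length + 1)
    ]
    return [
        (seq, n)
        for seq, n in Counter(windows).items()
        if n >= min_count
    ]

# Hypothetical log from a CT dictation session: the repeated
# open -> read -> dictate cluster would surface as automatable.
log = ["open_dose_report", "read_DLP", "dictate_dose_field"] * 6
candidates = frequent_action_sequences(log)
```

A production tool would of course need to handle interleaved tasks and near-duplicate sequences, but even this naive count surfaces the dose-dictation loop described above.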
AI also could notice which tools are used most often in a given situation. For example, when a radiologist hovers her mouse over a pulmonary nodule, the next step is often to right-click, select the measurement tool, and then measure the nodule. AI could notice this pattern and from then on, whenever the radiologist hovered the mouse over a pulmonary nodule, the system could offer the appropriate measurement tool (in the same way that a smartphone offers to map a highlighted address). Even better, the system could offer to automatically measure the nodule.
Hanging protocols also offer an opportunity for AI. Automating hanging protocols is a seemingly low-hanging fruit, since radiologists commonly maintain their personal pattern for each type of scan (eg, lumbar spine MRI). AI could notice these personal patterns and ensure that each scan is hung according to the radiologist's individual preference, shaving valuable seconds and reducing mental load.
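As an illustration of how such preferences might be learned, here is a minimal sketch that tallies the layout a reader manually arranges for each exam type and suggests the most frequent one the next time. The reader IDs, exam types, and layout strings are all invented for the example.

```python
from collections import Counter, defaultdict

class HangingProtocolLearner:
    """Tally the layout each reader manually arranges per exam type,
    then suggest the most frequent one for future exams.
    (Sketch only; identifiers and layouts are hypothetical.)"""

    def __init__(self):
        # (reader, exam_type) -> Counter of observed layouts
        self._observed = defaultdict(Counter)

    def record(self, reader, exam_type, layout):
        self._observed[(reader, exam_type)][layout] += 1

    def suggest(self, reader, exam_type):
        layouts = self._observed.get((reader, exam_type))
        if not layouts:
            return None  # no history yet; fall back to the site default
        return layouts.most_common(1)[0][0]

learner = HangingProtocolLearner()
learner.record("reader_1", "MRI lumbar spine", "sag T2 | sag T1 | ax T2")
learner.record("reader_1", "MRI lumbar spine", "sag T2 | sag T1 | ax T2")
```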
Information Integration
Currently, many radiology AI algorithms receive images as input and produce a diagnosis or predicted outcome as output. However, a different output could be even more useful. After receiving images as input, the output from AI could be additional information that radiologists need for image evaluation. With those details at their fingertips, radiologists could then skip the step of searching for information and instead jump straight to clinical decision-making.
For example, when a radiologist is dictating that hand radiographs show erosions, natural language processing (NLP) could automatically search prior reports for erosions elsewhere in the patient's body. Even better, NLP could search prior reports for terms known through an ontology to be related to erosions (such as rheumatoid arthritis, erosive arthritis, or gout).
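A toy sketch of that ontology-driven search might look like the following. The hard-coded term list stands in for a real ontology such as RadLex or SNOMED CT, and the report records are invented for illustration.

```python
# Toy stand-in for an ontology; a real system would query term
# relationships in a resource like RadLex or SNOMED CT.
RELATED_TERMS = {
    "erosions": ["erosion", "erosive arthritis", "rheumatoid arthritis", "gout"],
}

def search_prior_reports(reports, finding):
    """Return the IDs of prior reports that mention the dictated
    finding or any ontology-related term (case-insensitive)."""
    terms = [finding.lower()] + [t.lower() for t in RELATED_TERMS.get(finding, [])]
    return [
        report["id"]
        for report in reports
        if any(term in report["text"].lower() for term in terms)
    ]

# Invented prior reports for illustration
priors = [
    {"id": "2019-03", "text": "Podagra; findings compatible with gout."},
    {"id": "2021-07", "text": "Unremarkable chest radiograph."},
]
matches = search_prior_reports(priors, "erosions")  # flags the 2019 gout report
```

Real report search would use proper NLP rather than substring matching, but the principle is the same: expand the dictated finding through the ontology before searching.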
Additionally, while a radiologist is examining a body part where surgery has been performed, computer vision (which enables machines to see, process, and analyze images) could find and display the most recent prior preoperative images of that body part showing the primary mass. This would allow the radiologist to easily compare the imaging characteristics of soft tissue in the postoperative location with those of the primary mass and decide how suspicious to be about recurrence.
Radiologists are among the most tech-forward physicians, early adopters who seek out transformative technologies. Yet so much of AI's potential in radiology remains untapped.
Now is the time to seize the promise that AI holds, applying those capabilities in broader ways. AI can use pattern detection and information integration to shatter the monoliths of repetitive tasks and disjointed information that stand in the way of greater productivity and efficiency. So, let's lift the hammer and start smashing!
Stephanie Hou, MD
Medscape Radiology © 2022 WebMD, LLC
Any views expressed above are the author's own and do not necessarily reflect the views of WebMD or Medscape.
Cite this: How Can Artificial Intelligence Help Radiologists? - Medscape - Sep 20, 2022.