AI 'Could Slash Chest X-Ray Turnaround Times'

Liam Davenport

January 22, 2019

Using an artificial intelligence (AI) system to triage chest X-rays (CXRs) based on their clinical urgency could significantly cut the time it takes for critical and urgent findings to be reported by an expert radiologist, say UK scientists.

Prof Giovanni Montana, chair in data science in WMG at the University of Warwick, Coventry, and colleagues used an AI-led modelling technique to develop two deep convolutional neural networks (CNNs) that could prioritise CXR images based on their clinical urgency in real time.

The research, published online by the journal Radiology on 22nd January, showed that, across more than 470,000 CXR images and their reports, the system detected critical CXRs with a sensitivity of 65% and a negative predictive value (NPV) of 99%.

Moreover, it was estimated that the AI triaging system would reduce the average reporting delay for critical CXRs from more than 11 days to less than 3, while the delay for urgent images would drop from more than 7 to just over 4 days.

Reducing Delays

Prof Montana said in a news release that the results show "that alternative models of care, such as computer vision algorithms, could be used to greatly reduce delays in the process of identifying and acting on abnormal X-rays".

This, he said, is particularly the case for CXRs, "which account for 40% of all diagnostic imaging performed worldwide".

"The application of these technologies also extends to many other imaging modalities including magnetic resonance imaging and computed tomography," he added.

The authors explain that artificial neural networks involve layers of processing units that are trained on algorithms to create models able to process data to a high level of abstraction.

While these so-called deep neural networks have performed well on computer vision tasks in specialties such as radiology and dermatology, their application for AI-led independent image reporting "remains a controversial topic".

This is despite the potential to improve workflow and workforce efficiency.

Increasing Clinical Demands

As Prof Montana observed, "the increasing clinical demands on radiology departments worldwide has challenged current service delivery models, particularly in publicly-funded healthcare systems".

He said: "It is no longer feasible for many radiology departments, with their current staffing level, to report all acquired plain radiographs in a timely manner, leading to large backlogs of unreported studies."

Prof Montana pointed out that, in the UK, "it is estimated that at any time there are over 300,000 radiographs waiting over 30 days for reporting".

The researchers therefore set out to develop and test an AI system that would automatically triage adult CXRs based on their predicted urgency in real time.

They obtained 470,388 frontal CXRs of adult patients, together with their radiology reports, acquired at their institution between 2007 and 2017, and separated these into a training set of 329,698 images, a testing set of 41,407 images, and an internal validation set of 42,298 CXRs.

The radiology reports were analysed using a natural language processing (NLP) system to divide them into:

  • Critical, requiring immediate report

  • Urgent, requiring a report within 48 hours

  • Non-urgent, requiring a report within the standard departmental turnaround time

  • Normal, indicating no abnormalities
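The four-level scheme above maps naturally onto an ordered priority type. A minimal sketch of that mapping (illustrative only, not the authors' implementation; the class and constant names are hypothetical):

```python
from enum import IntEnum

class Priority(IntEnum):
    """Triage levels from the study; lower value = more urgent."""
    CRITICAL = 0    # requires an immediate report
    URGENT = 1      # requires a report within 48 hours
    NON_URGENT = 2  # standard departmental turnaround time
    NORMAL = 3      # no abnormalities

# Target reporting windows in hours; None means the standard
# departmental turnaround applies rather than a fixed deadline.
TARGET_WINDOW_HOURS = {
    Priority.CRITICAL: 0,
    Priority.URGENT: 48,
    Priority.NON_URGENT: None,
    Priority.NORMAL: None,
}
```

Using an `IntEnum` means the levels compare and sort directly, which is what a prioritisation queue needs.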

The team then developed two deep CNNs to automatically extract imaging patterns from pixel values and build up a real-time automated radiograph prioritisation system.

This was designed so that, when a radiograph is acquired, it is processed by the deep CNN, assigned a predicted priority level, and inserted into a dynamic reporting queue that reflects both the predicted urgency and the waiting time of radiographs already in the queue.
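One way such a dynamic queue could work is as a heap keyed on urgency first and acquisition time second, so that within each priority level the longest-waiting study surfaces first. This is a minimal sketch under that assumption; the field and function names are hypothetical, not the study's code:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class QueuedRadiograph:
    # heapq pops the smallest item first: most urgent priority,
    # then earliest acquisition time (i.e. longest wait) as tiebreak.
    priority: int                  # 0 = critical ... 3 = normal
    acquired_at: float             # acquisition timestamp
    study_id: str = field(compare=False)  # excluded from ordering

reporting_queue: list[QueuedRadiograph] = []

def enqueue(study_id: str, priority: int, acquired_at: float) -> None:
    """Insert a newly classified radiograph into the reporting queue."""
    heapq.heappush(reporting_queue, QueuedRadiograph(priority, acquired_at, study_id))

def next_for_reporting() -> str:
    """Pop the radiograph that should be reported next."""
    return heapq.heappop(reporting_queue).study_id

enqueue("cxr-001", 2, 100.0)  # non-urgent, acquired second
enqueue("cxr-002", 0, 200.0)  # critical, acquired last
enqueue("cxr-003", 2, 50.0)   # non-urgent, waiting longest
```

Here `cxr-002` would be reported first despite arriving last, and `cxr-003` would beat `cxr-001` because it has waited longer at the same priority level.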

'Good' Performance

The researchers say that the NLP's performance was "very good": for accurately extracting radiological findings on normal CXRs, it achieved a sensitivity of 98%, a specificity of 99%, a positive predictive value (PPV) of 97% and a negative predictive value (NPV) of 99%.

For critical radiographs, the sensitivity of the NLP was 96%, while the specificity, PPV and NPV were 97%, 84% and 99%, respectively.

The performance of the AI prioritisation system overall was ranked as "good", with a sensitivity for normal radiographs of 71%, a specificity of 95%, a PPV of 73% and an NPV of 99%.

For critical radiographs, the AI system had a sensitivity of 65%, a specificity of 94%, a PPV of 61%, and an NPV of 99%. 
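All four statistics quoted for each class follow from the standard confusion-matrix counts. A minimal sketch of the calculation, using hypothetical counts for illustration (not the study's data):

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict[str, float]:
    """Per-class diagnostic statistics from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # true-positive rate
        "specificity": tn / (tn + fp),  # true-negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts only: 100 truly critical studies, 700 non-critical
m = diagnostic_metrics(tp=65, fp=42, fn=35, tn=658)
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on how common critical findings are in the data, which is why an NPV of 99% can coexist with a much lower PPV when critical studies are rare.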

Analysis indicated that the AI system would significantly reduce the average delay for the reporting of radiographs as critical from 11.2 days to 2.7 days (p<0.001).

For urgent radiographic findings, the mean delay was reduced from 7.6 days to 4.1 days (p<0.001).

Turnaround Times

The team writes that, "even in simulations where 10% or 20% of radiographs would be reported out of order, most critical radiographs would have been reported within 24 hours of acquisition irrespective of referrer or clinical information".

They note, however, that the improvements in turnaround times "would remain unacceptable for North American practice".

It was also observed that the AI system would result in normal radiographs taking longer to be reported, with less than 40% turned around within 24 hours.

"These results belie the variance of reporting turnaround times of our historical data set," the researchers say, "and, more important, highlight that our organisational behaviours and clinical pathways have a substantial effect on reporting turnaround - eg, when in the 24-hour day the chest radiographs were requested, which department and/or referrer they came from, and the radiology staffing levels for reporting radiographs at different times of the day, week, or month within our hospital network."

The study was supported by the King’s Health Partners’ Research and Development Challenge Fund, a King’s Health Accelerator Award, the Department of Health via the National Institute for Health Research Comprehensive Biomedical Research Centre award to Guy's & St Thomas' NHS Foundation Trust in partnership with King's College London and King's College Hospital NHS Foundation Trust, the King's College London/University College London Comprehensive Cancer Imaging Centre funded by Cancer Research UK and Engineering and Physical Sciences Research Council (EPSRC) in association with the Medical Research Council and Department of Health, and Wellcome EPSRC Centre for Medical Engineering at King’s College London.

No conflicts of interest relevant to the article declared. 

Radiology 2019. doi: 10.1148/radiol.2018180921

