By Mark Watts
I am not saying that imaging artificial intelligence can insert a catheter or place an empathetic hand on a patient and say, “You have cancer and I am here to help guide you through this journey.”
I am saying radiologists and health care organizations that embrace artificial intelligence will replace those that do not.
“Do not touch the patient. Analyzing heart rhythm. Please wait. Preparing shock. Move away from the patient. Do not touch the patient, a shock will be delivered in 3, 2, 1.”
These are the voice commands provided by an automated external defibrillator (AED) in my CPR recertification class. I looked to my left and watched a 19-year-old registration clerk, a layperson in medical training, accurately diagnose pulseless ventricular tachycardia and provide an advanced interventional treatment to save a life. The clerk’s skills are enhanced by the AED tool. This is the good kind of artificial intelligence. Right?
Artificial intelligence (AI) has joined the list of imaging buzzwords that includes interoperability, vendor-neutral archive and blockchain.
I would like to share what is known and what is hype, and work with readers to create useful AI imaging tools.
There’s been an explosion in AI recently because of the convergence of four exponentials.
The first exponential is Moore’s Law: computing performance doubles roughly every two years.
The second exponential is data: the amount of data we collect also doubles roughly every two years, and machine learning algorithms are very hungry for data.
The third exponential is maturity: we’ve been working on AI for 50 years or so, and the algorithms are starting to get better and teach each other.
The fourth exponential is investment: AI funding has been doubling every two years.
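The arithmetic behind these exponentials is worth making concrete. A minimal sketch (illustrative only; the two-year doubling period is the article’s rough figure, not a measured constant):

```python
# Illustrative only: if a quantity doubles every `period` years,
# its growth factor over `years` is 2 ** (years / period).
def growth_factor(years: float, period: float = 2.0) -> float:
    """Growth factor after `years`, doubling every `period` years."""
    return 2 ** (years / period)

# A decade of Moore's-Law-style doubling is five doublings: a 32x increase.
print(growth_factor(10))  # 32.0
```

Five doublings in a decade is why all four curves feel like they arrived at once.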
We now have the computing power, the data, the algorithms and a lot of people working on the problems. The surviving radiologists’ workflow will be supported by imaging artificial intelligence (IAI). A future radiologist will sit or stand at a reading workstation and assume IAI has optimized the pre-imaging phase, the image acquisition, the reporting and post-reporting tools. These main areas are where I see potential for AI use.
Clinical Decision Support – tools at the requesting stage which may guide clinicians to the appropriate single best test or suite of tests for a given presentation or differential.
Optimized Scheduling – both within an enterprise and with patients to route appointments to the most convenient and efficient location and scanner to enhance productivity.
Enhanced Digital Communication with patients (including electronic consent) – tools to better prepare a patient with information about what a test involves, how to prepare for it and the importance to their overall treatment journey.
The benefits of these measures would be to eliminate time wasted by poor scanner scheduling, reduce the incidence of no-shows, and deliver the information a patient needs for a scan when they are most receptive, rather than during the stressful period of attendance. They may also reduce time and support needs during the scan itself (for example, by setting expectations for positioning).
Image Acquisition Stage
AI-assisted image acquisition to reduce the time it takes to scan (for example, multi-parametric MRI scans) and reduce the number of poor-quality images, thus potentially improving accuracy and reducing the need for recalls.
AI-assisted dose management – at a macro-level by reducing signal noise to improve image quality of lower dose scans, and at a patient level.
Real-time on-scanner image detection/analysis. This itself could have several potential benefits. During my years at the Mayo Clinic in Scottsdale, Arizona each CT exam was reviewed by a radiologist before the patient was taken off the table. Generally, radiologists’ readings are “cold,” that is, performed when the patient is no longer in the radiology department or often not even in the hospital (outpatient scanning). Benefits of on-modality analysis may allow stratification for a critical finding that requires immediate/urgent medical attention or abnormalities that require urgent/expedited reporting.
Normal scan – automated reporting of normal examinations for near-contemporaneous feedback to expedite management and provide earlier reassurance to patients.
A subset of the above might be detection of changes to known pathology (for example a nodule or cancer follow-up) with either automation of “no change” or prioritization of “significant change” findings.
Image Interpretation and Reporting
Examination-Routing: Intelligence worklist management to ensure that examinations are reviewed as quickly and efficiently as possible by the appropriate person based on rules such as:
- Urgent findings
- Key performance indicators/metrics
- “Normal” pathways as alluded to above
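The routing rules above amount to a priority ordering over the worklist. A minimal sketch, assuming hypothetical category names and priority values (a real system would derive these from the RIS/PACS and site policy):

```python
import heapq

# Hypothetical priorities; lower numbers are read first.
PRIORITY = {"urgent": 0, "kpi_breach": 1, "routine": 2, "normal": 3}

def build_worklist(exams):
    """Order exams so urgent findings come first, then KPI-driven cases,
    then routine work; 'normal' studies sort last and could instead be
    diverted to an automated-report pathway."""
    heap = [(PRIORITY[e["category"]], i, e) for i, e in enumerate(exams)]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]

exams = [
    {"id": "CT-1", "category": "routine"},
    {"id": "CT-2", "category": "urgent"},
    {"id": "CR-3", "category": "normal"},
]
print([e["id"] for e in build_worklist(exams)])  # ['CT-2', 'CT-1', 'CR-3']
```

The index `i` in each tuple is a tie-breaker so that exams with equal priority keep their arrival order.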
Optimized Presentation of Imaging (ready for reporting): beyond the bane of a radiologist’s life that is “hanging protocols” and “relevant priors,” this would more broadly bring appropriate investigations, clinical information and findings from outside radiology to the reporter’s attention, to enhance quality and reduce time wasted on multi-source hunting.
Lesion Segmentation and Tracking: I recognize there are about a million algorithms in development that profess to do this, but instead of “App stores” requiring human intervention to pull individual pieces of software to run and then needing user input to validate each nodule, options could include:
- Hardwired into a natural workflow which (for example) automatically segments out lesions (across the entire image acquisition not just in individual body part models), measures them, detects changes in prior lesions and presents them as a summarized finding in the report.
- On-demand Analysis Aid: humans are generally poor at differentiating between true positives and false positives, so an algorithm that segments “nodules” and presents them for validation might lead to over-calls. Instead, an interactive tool might be activated on demand to provide a “second opinion” on a region of uncertainty, rather than pre-marking multiple regions for a person to accept or reject.
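The change-detection half of this workflow is simple to sketch. A minimal example, assuming a hypothetical 20% diameter-growth threshold (loosely inspired by RECIST-style criteria; a real tool would use validated, site-approved thresholds):

```python
# Hypothetical threshold: flag >= 20% diameter growth as significant.
SIGNIFICANT_GROWTH = 0.20

def classify_change(prior_mm: float, current_mm: float) -> str:
    """Compare a lesion's prior and current diameters (in mm)."""
    if prior_mm <= 0:
        return "new lesion"
    growth = (current_mm - prior_mm) / prior_mm
    if growth >= SIGNIFICANT_GROWTH:
        return "significant change"
    return "no change"

print(classify_change(10.0, 13.0))  # significant change (30% growth)
print(classify_change(10.0, 10.5))  # no change
```

“No change” results could feed the automated-report pathway, while “significant change” findings are prioritized for the reader.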
Image Analysis Support: This might involve, for example, access to image libraries with suggestions of possible diagnoses based on pathognomonic features. More specifically, this might involve radiomics features to help classify tumors. Another example might be analyzing the attenuation, enhancement characteristics or MR-signal profiles and suggesting the most likely etiology based on these parameters. Of course, we should also remember the more prosaic analysis of pathology on plain X-rays.
Natural Language Processing applications might be employed in various guises such as:
- Improve the accuracy of voice recognition, correct typographical errors while reporting, or deploy suggested-next-word methodologies to make reporting more efficient.
- Automatic generation of report summary based on the body of the text, including details such as auto-inserting TNM stage based on descriptors of pathology.
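The summary-generation idea can be sketched with simple keyword rules. This is a toy stand-in, not a real NLP module: the keyword list is hypothetical, and a production system would use a trained language model rather than string matching:

```python
# Hypothetical finding keywords; a real system would use a trained model.
FINDING_KEYWORDS = ("mass", "nodule", "fracture", "effusion", "consolidation")

def summarize_report(body: str) -> str:
    """Pull sentences mentioning key findings into an impression line."""
    sentences = [s.strip() for s in body.split(".") if s.strip()]
    hits = [s for s in sentences
            if any(k in s.lower() for k in FINDING_KEYWORDS)]
    return "IMPRESSION: " + ("; ".join(hits) if hits else "No acute findings")

body = ("The lungs are clear. An 8 mm nodule is seen in the right upper "
        "lobe. The heart size is normal.")
print(summarize_report(body))
```

Even this crude rule-based version shows the shape of the task: locate finding statements in the body and surface them in the summary.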
Report-Creation: The next step from assisted reporting would be independent report creation modules. We are already seeing some of these in development in the breast radiology space but possibilities include:
- Breast second reader applications – helping to address the massive shortage of radiologists.
- Full template reporting – as we discussed in the image acquisition phase, if the analysis deems an examination is normal there is no reason this could not generate an appropriate report thus potentially massively reducing the reporting burden of the normals. Indeed, this could equally work with (for example) X-rays for fractures – coupled with appropriate routing of the reports.
Clinical Decision Support: Access to latest pathways and protocols to ensure a radiologist’s advice conforms to current standards (for example for lesion/nodule follow-up guidelines).
This would involve various facets of automatically or optimally routing the report and its findings, such as:
- Automatic notification to responsible clinicians of critical findings.
- Automatically scheduling a case for discussion at the next appropriate multidisciplinary team (MDT) meeting.
- Scheduling/requesting appropriate onward examinations based on the examination findings such as PET-CT or interval CT for nodules as per guidelines.
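The routing options above are essentially a rule table mapping finding categories to onward actions. A minimal sketch, with hypothetical category and action names (a real system would drive these from guidelines and site protocols):

```python
# Hypothetical routing rules mapping finding categories to onward actions.
ROUTING_RULES = {
    "critical": ["notify_responsible_clinician"],
    "mdt": ["schedule_mdt_discussion"],
    "nodule_followup": ["request_interval_ct"],
}

def route_findings(finding_categories):
    """Collect the onward actions triggered by a report's findings."""
    actions = []
    for category in finding_categories:
        actions.extend(ROUTING_RULES.get(category, []))
    return actions

print(route_findings(["critical", "nodule_followup"]))
# ['notify_responsible_clinician', 'request_interval_ct']
```

Keeping the rules in a table rather than scattered code makes them auditable, which matters when the action is paging a clinician.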
The aim of the radiological journey with IAI is that it should result in greater efficiency in the end-to-end pathway without increasing the administrative burden on users to deploy it. The net result would be faster and more efficient patient-centric imaging. By considering some of the fully automated outcomes, we could also seek to redress the massive differential between imaging demand and capacity.
AI is like Tony Stark’s Iron Man suit. It takes someone, like the registration clerk, and makes them into a superhero! Suddenly you could be doing things 10 times above your level and providing them much more cheaply than anyone else could.
Therefore, I think IAI will replace some radiologists.
Mark Watts has over 20 years of experience as an imaging professional, with vast expertise in imaging informatics and IT issues. He has served in many roles in both hospitals and industry as a health care vice president, imaging director and IT consultant. His knowledge and experience in the convergence of IT and imaging have made him a sought-after author, speaker and consultant. He has authored a textbook on informatics and was a pioneer in the adoption and development of PACS and VNA technologies.