– By Matt Skoufalos –
Like medical imaging itself, artificial intelligence (AI) and machine learning technologies have captured the collective imagination of the health care space not only because of their seemingly limitless potential to improve the universal standard of care, but also because of the complementary value they represent to various branches of medicine.
Taken together with the data-driven nature of the 21st-century world at large, AI is “something every industry will have to pursue,” said Wesley Gilson, In Vivo Lead for Artificial Intelligence at Siemens Healthineers North America. The task for those in medical imaging is learning how best to leverage AI to reduce errors and solve problems faster without being overwhelmed by the volume of newly available information.
“AI was maybe the hottest buzzword of 2019, not just in health care, but across the board,” Gilson said. “Health care digitalization continues to grow with more and more data sources being introduced every day.”
As that development of data sources progresses, however, the health care field isn’t adding a corresponding number of physicians and radiologists to be able to address them, Gilson said.
“Now you’re putting a larger demand on them,” he said, “and as you put more burden on physicians, there’s an increase in error rate.”
But Gilson believes that even as AI-driven products add to that data burden, they also have the potential to mitigate the mistakes that can come from reading through large amounts of data. By getting to significant findings faster and eliminating the need for manual testing, AI-driven solutions can offer an abundance of quantifiable information earlier in the diagnostic process, supporting radiologists’ workflow and avoiding the burnout that often is a precursor to such errors. They can also be leveraged to automate some of the manual work of the job, work that can be time-consuming and doesn’t always yield meaningful results.
“If we can change practice management in the ordering of radiology exams, there are opportunities for AI to help in that process; making sure you get the right study read by the right radiologist for the right reasons,” Gilson said. “There could also be opportunities to explore things like triaging: an AI solution that could do a rapid read of imaging data could triage that study to a higher order in the work list, and earlier intervention could be done for that patient. It’s not taking away from a radiologist’s ability to read the images, but putting it in a priority list that highlights a potential critical finding for them to address.”
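The triage idea Gilson describes can be sketched as a simple re-ordered work list: studies flagged by an AI rapid read are promoted ahead of routine first-in, first-out ordering, without removing anything from the radiologist’s queue. This is only an illustrative sketch; the study names and the single critical/routine flag are hypothetical, standing in for whatever scoring a real AI solution would produce.

```python
def build_worklist(studies):
    """Re-order a radiology work list so AI-flagged studies come first.

    studies: list of (study_name, ai_critical_flag) in arrival order.
    sorted() is stable, so routine studies keep their first-in,
    first-out order while flagged studies move to the front.
    """
    return [name for name, flagged in
            sorted(studies, key=lambda s: 0 if s[1] else 1)]

# Hypothetical arrivals: the AI rapid read flags the chest CT.
worklist = build_worklist([
    ("knee MRI", False),
    ("chest CT", True),     # possible critical finding
    ("abdominal CT", False),
])
print(worklist)  # ['chest CT', 'knee MRI', 'abdominal CT']
```

The flagged study jumps the queue, while the two routine studies remain in arrival order, matching Gilson’s point that triage reprioritizes rather than replaces the read.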
“Machine learning can help improve the operation of our systems and the quality of the data that come from our systems,” he said. “Beyond that, what can we do with the broader context of data being generated in the health care system? Where are opportunities for AI to help us really make actionable the large amount of data that we have now in our electronic health records, so we can improve the ability to input data and consume the data that’s in those systems?”
Michelle Edler, GE Healthcare senior vice president for enterprise imaging and care area solutions, said that AI-driven solutions address one or more of four practice areas: turnaround time in critical cases, time spent on image analysis, diagnostic quality and context for downstream clinicians. By creating what she described as “intelligent clinical context,” these tools can make radiologists more effective, freeing them to devote more time to more complicated cases.
“The big thing we’ve been hearing from the radiology community is, ‘Bring me tools to help me do my job faster and better,’” Edler said. “That’s what they’re looking for, and that’s exactly what we’re trying to achieve.”
“The intention is that we can coordinate the image analysis of any image that comes into your PACS system from any modality,” she said. “We’re working to try to drive efficiency for the radiologist, we’re looking to give them vital tools to verify what gets detected in the image, and ultimately, we’re enabling quantification of downstream reporting.”
Susan Harvey, MD, vice president of global medical affairs in the breast and skeletal health division at Hologic Inc., expects AI to help radiologists, not replace them. The most effective solutions, Harvey said, will detect cancers earlier, “identifying the most relevant information in a sea of data, quickly, to enable accurate diagnoses and improved patient outcomes.”
AI-enhanced detection can also lead to fewer false positives while speeding diagnostic times, thereby eventually improving patient throughput and provider workflow, she said, but cautioned “there will be a learning curve for radiologists, as with the adoption of every new advancement.”
Cutting the elapsed turnaround time in a critical care situation allows radiologists to save more lives. AI can help prioritize cases by recognizing patterns in images to draw physicians’ attention to cases where time is of the essence, Edler said.
“Don’t let things sit in a queue, first-in, first-out situation,” she said. “Putting that AI into the device at the point of care as well as in the PACS system can help reduce the turnaround time in the detection and prioritization of these images.”
Similarly, AI-driven image analysis can cut down the time a radiologist may spend analyzing an image, from taking manual measurements to scrolling through a significant quantity of images in order to find the most pertinent ones. Automating some of that work, and training AI-driven software to detect previously unobserved patterns, may improve the speed of a diagnosis as well as identify issues that may previously have gone undetected. Furthermore, the idea of creating a “richer context for the downstream clinicians” involves leveraging AI to “quantify volume impact” of their findings, Edler said.
“Today, radiologists may detect or classify a collapsed lung as trace, small, medium, large, but there’s no [way to identify] how much of the lung has been compromised with this collapse,” she said. “Quantifying some of the indications is time-consuming for a radiologist, but it’s easy for a computer to quantify the volume that’s been compromised.”
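Edler’s collapsed-lung example boils down to counting labeled voxels. As a minimal sketch, assuming an upstream segmentation model has already produced binary masks for the lung and the collapsed region, and that voxel dimensions come from the scan’s acquisition metadata, the quantification itself is straightforward:

```python
import numpy as np

def compromised_fraction(lung_mask, collapse_mask, voxel_volume_mm3):
    """Estimate how much lung volume is compromised by a collapse.

    lung_mask, collapse_mask: boolean 3-D arrays of the same shape,
    assumed to come from an upstream segmentation model (hypothetical
    here). voxel_volume_mm3 is the physical volume of one voxel.
    Returns (collapsed volume in mL, fraction of total lung volume).
    """
    lung_ml = lung_mask.sum() * voxel_volume_mm3 / 1000.0       # mm^3 -> mL
    collapsed_ml = collapse_mask.sum() * voxel_volume_mm3 / 1000.0
    return collapsed_ml, collapsed_ml / lung_ml

# Toy example: a 4x4x4 volume of 1 mm^3 voxels, one quarter collapsed.
lung = np.ones((4, 4, 4), dtype=bool)
collapse = np.zeros_like(lung)
collapse[:1] = True  # 16 of 64 voxels
vol, frac = compromised_fraction(lung, collapse, voxel_volume_mm3=1.0)
print(vol, frac)  # 0.016 mL collapsed, 0.25 of the lung
```

The hard part, of course, is the segmentation itself; but as Edler notes, once the masks exist, turning a qualitative “small/medium/large” call into a percentage is trivial for a computer.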
Given the breadth of the diagnostic imaging space, from the number of body parts to the number of image acquisition types and the kinds of diseases and conditions that may be categorized and analyzed in a clinical setting, AI-driven technology is fertile ground for innovation. Hundreds of startups are developing independent approaches to these issues, and established corporate entities are already covering ground in the space. But to truly optimize the algorithms that make AI as effective as possible, collaboration is critical.
“GE will never be able to cover every single one of these if we go at this in-house, organically,” Edler said. “We’ve been tackling the high-volume, high-frequency ones internal to GE, but the best way to provide that is to partner with those parties, and that’s exactly what we’re doing.”
Edler said GE is working to be “a partner of choice to startup companies as well as to academic institutions that want to build their own algorithms,” and supports “a development ecosystem” for clinical partners who want to develop their own in-house algorithms.
“We have tools to help them to go faster, whether that be a workbench to train and build algorithms themselves, or imaging reconstruction services we’re piloting with some health care facilities to help with the visualization of the output,” Edler said.
From a broad manufacturer perspective, Zack Hornberger, director of cybersecurity and informatics at the Medical Imaging and Technology Alliance (MITA), said the advanced imaging industry is incorporating AI technologies into its products “with the quadruple aim” of improving health outcomes at lower costs to deliver a better experience for patients and clinicians.
“When it comes to AI that’s developed well, you’re not going to see the clinician experience suffer,” Hornberger said. “Imaging’s been doing AI since before AI was AI. But AI really makes a big difference in things that happen behind the scenes right now. It’s also a lower cost on the system, and really, those sorts of seemingly small things have a big impact.”
To Hornberger, the collaborative approach that will drive the success of AI among vendors and clinicians is familiar to his organization. By involving a variety of interested parties in the conversation, MITA believes, opportunities can be uncovered to improve image acquisition, the patient experience and practice workflow.
“Manufacturers are most excited about AI’s ability to improve the quadruple aim almost all at once,” he said. “Not often does technology come along that can address all those aspects in one fell swoop. Usually it’s the improvement of some at the cost of the other, and AI has really done well to improve them all dramatically and concurrently.
“What gets measured gets better,” Hornberger said. “That’s the received wisdom at this point. If we have more metrics, we can make better decisions. We can use data to create some of those solutions.”
The application of advanced data to the imaging suite isn’t only the province of AI, said Angelic Bush, director of radiology at the University of Texas Medical Branch at Galveston. Bush believes that measuring and managing the outliers in any aspect of practice data can create “a more precise, repeatable process for a timely performance,” whether that process is automated by machine learning or not.
“What are we considering advanced data?” Bush said. “The data exist; it’s about looking at them from a different perspective.”
As the old adage goes, when all you have is a hammer, everything looks like a nail. If data is that hammer, Bush said, it’ll be used to strike at everything. But it will never have the capacity to build a house without the vision of an architect and the skill of a carpenter. Therein lies the work of radiologists and practice administrators: “to start rewiring the way we think and look at the data,” she said.
“We do need to get to the data easier,” Bush said. “We need to know what kind of data to get to, but we don’t know that until we start re-wiring our brain. It’s about really approaching every question from a lean process, and getting into that data and saying, ‘Let’s look at the data before making any assumptions.’ ”
“You have to pretend you don’t know anything,” she said. “Once we do that, we have to make sure we have the right tools to look at that data. You don’t want to manually extract this over and over and over again. That’s where automation really is going to make a difference.”
To Bush, any data can be re-evaluated from the perspective of advanced analysis, whether it’s information about lines of service, staffing or equipment efficacy. By seeking out the extreme variances in the information that’s measured, directors and decision-makers can more effectively drive their practices to greater success.
“Bypass RIS and get machine-level data,” Bush said. “That’s a great way to justify new equipment; to have a level-setting conversation with a technologist who may not be meeting your expectations; to see what is your optimal opportunity for performance.”
“I think we’re all looking at the same issues from different perspectives,” she said. “When we have those extreme variances, I’m absolutely sure it’s just as frustrating to the employees at the bedside as to the patients affected, and to cost avoidance.”
To identify baseline practice statistics and outlier data, Bush recommends a box-and-whisker plot approach. She describes it as “a very standardized, reproducible process with reliable outcomes,” whereby eliminating any of the extreme data points improves all quartiles measured. This approach to measuring key performance indicators illustrates visually where the outlying data are, which Bush described as an opportunity, whether it’s addressed with in-house team members or third-party vendors.
“Then you don’t have these extreme variances that we’re paying or getting reimbursed for,” she said. “Looking at those whiskers is the game-changer in operational efficiency and the data that allows us to address those. Most of us are not looking at the data that way yet.”
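The “whiskers” Bush refers to have a standard statistical definition: in a box-and-whisker plot, points falling outside 1.5 times the interquartile range beyond the first or third quartile are flagged as outliers (Tukey’s rule). A minimal sketch, using hypothetical exam turnaround times in minutes:

```python
import statistics

def iqr_outliers(values, k=1.5):
    """Flag values beyond the box-plot whiskers (Tukey's rule):
    anything outside [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, _, q3 = statistics.quantiles(values, n=4)  # quartile cut points
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]

# Hypothetical exam turnaround times (minutes); one extreme case.
times = [22, 25, 24, 27, 23, 26, 25, 90]
print(iqr_outliers(times))  # [90]
```

The 90-minute exam is exactly the kind of extreme variance Bush suggests investigating, whether the cause is equipment, staffing or process, while the quartiles themselves establish the repeatable baseline.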