By Mark Watts
This was the topic of a social media post by a medical doctor: he would take imaging AI seriously "when it can be sued for the mistakes it makes." The statement was about the speed at which AI has been adopted and the lack of governance around it.
When we started using computer-aided detection (CAD) for mammography in 2002, the workflow was feeding a film into a free-standing digitizer, where the software scanned it for suspicious findings. The resulting circle, square and triangle indicators were viewed by the radiologist and compared against the original breast film.
The overlay of the image was not stored.
Years later, the CAD software was incorporated into the picture archiving and communication system (PACS), and a DICOM standard was created for the CAD overlay. The intent was to use pattern-recognition software to draw the radiologist's attention to areas of concern.
The radiologist had sole authority to disregard the indicators on the film. What the radiologist did not have authority over was the permanence of those circle, square and triangle indicators in the PACS.
This was where the health care lawyers stepped in. We set up a meeting and had an open discussion about intent, latency and the potential unintended liability of leaving the markers on the image.
The breast cancer surveillance system established to catch breast cancer early is one of the most successful preventive health maintenance programs.
Women over 40 who follow the recommended annual screening interval leave a digital data trail. Each year when my wife gets her results, we breathe a collective sigh of relief that they are negative. (She lost her mom and her two sisters to breast cancer. Her sister was diagnosed at 45.)
The intent of CAD is to offer the radiologist assistance with the film as presented at that moment. It is the radiologist who decides what follow-up to recommend.
Having a mammogram and receiving a clear interpretation is not an all-clear for the rest of your life. It is a milepost on a long journey. If every circle, square and triangle indicator from CAD were stored in the PACS, somewhere, at some time in the future, a cancer would show up in an area that had been marked.
That is the discoverability of a non-error that looks like malpractice. It was never the intent of CAD, or of AI in imaging.
AI in imaging has been trained to recognize patterns on a screen. It is tireless and deployable everywhere. It has measurable error rates that can be improved over time. It can bring subspecialty expertise to rural and remote locations. It can coordinate the delivery of care in a superhuman effort through communication and transparency. No ego, no turf wars, no arguments over who will get paid for the advice.
Some would say CAD in mammography has become the standard of care, and that not using it would be considered less than optimal.
I would like to see that same transition of acceptance occur for AI in imaging, with governance.
Tested, proven and studied to the degree that AI is taken seriously.
To the point when a provider can be sued for not using it.
This will take time and patience. CAD was not an overnight success.
Mark Watts is an experienced imaging professional who founded an AI company called Zenlike.ai.