By Mark Watts
Editor’s Note: The four-part series on FDA post deployment requirements for Software as a Medical Device will continue in April.
I wrote my first article for Imaging Community Exchange (ICE) Magazine, “Will AI Replace Radiologist,” for the January 1, 2020, issue. I am revisiting it today, more than five years later, prompted by the recent paper “The Effect of AI on the Radiologist Workforce: A Task-Based Analysis” by Curtis P. Langlotz, MD, PhD, a professor of radiology, medicine, and biomedical data science at Stanford University.
For nearly a decade, a shadow of uncertainty has loomed over the field of radiology. High-profile experts once predicted that machine learning would soon displace the human doctors who interpret our X-rays and MRIs. Yet, as we move into 2025, the conversation has shifted from whether AI will be used to exactly how it will reshape the daily grind of the medical workforce. The study by Langlotz provides a groundbreaking, quantitative look at this transformation, moving past the hype to analyze the specific tasks that make up a radiologist’s day.
This research matters because it serves as a “canary in the coal mine” for other highly skilled professions. By looking at radiology – a field where three-quarters of new FDA-cleared AI devices are already concentrated – we can catch a glimpse of how innovation might alter the workflow of any job that relies on expert data analysis. The study utilizes a “task-based analysis,” an approach favored by labor economists, which breaks down a job into individual units of activity to see which specific pieces AI can handle better or faster than a human.
The findings are striking: the model predicts a 33% reduction in the total hours worked by radiologists over the next five years. While that number might sound like the beginning of the end for the profession, the study suggests a more nuanced reality. Most of this time savings comes from streamlining the most repetitive and tedious parts of the job, such as drafting reports and filtering out “normal” scans that don’t actually need a human’s expert eye.
To understand how we get to that 33% figure, we first have to look at how these doctors spend their time. Research shows that while roughly 67% of their day is spent interpreting images, the rest is a mix of “protocoling” (deciding which specific test a patient needs), communicating with other doctors, and explaining results to patients. AI isn’t just a single tool; the study identifies 15 different types of AI applications that target these specific moments in the workday, from the second a doctor orders a test to the moment the patient reads their report.
One of the biggest “productivity boosters” identified is automated report drafting. Instead of a doctor starting from a blank page for every patient, AI can now produce a high-quality initial draft describing the findings. This alone could increase productivity by up to 20% for complex scans like CTs and MRIs. It functions similarly to having a digital medical resident who does the heavy lifting of documentation, allowing the senior doctor to focus on verification and high-level diagnosis.
Another major shift involves “imaging study delegation.” In fields like mammography and chest X-rays, a huge portion of scans are “normal.” The study suggests that AI is now capable of identifying these unremarkable cases and pulling them out of the human queue entirely. In fact, the model assumes AI could handle up to 50% of screening mammograms and 40% of outpatient X-rays without human intervention. By putting the radiologist “out of the loop” for routine cases, the remaining workforce can focus on the patients who actually need help.
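To see how task-level gains like these could compound toward a figure in the neighborhood of the study’s 33%, here is a back-of-the-envelope sketch. It is not the study’s actual model; the 67% interpretation share, 20% drafting gain, and 50%/40% delegation rates come from the figures above, but the blended delegation rate and the 10% saving on non-reading tasks are assumptions made purely for illustration.

```python
# Illustrative back-of-envelope sketch, NOT the study's actual model.
# Figures from the article: 67% of the day is image interpretation,
# report drafting boosts productivity up to 20%, and AI could handle
# 50% of screening mammograms / 40% of outpatient X-rays alone.

interpretation_share = 0.67        # fraction of the workday spent reading images
other_share = 1 - interpretation_share

draft_productivity_gain = 0.20     # faster reads via AI-drafted reports
delegated_fraction = 0.25          # ASSUMED blended share of studies AI handles
                                   # alone (high for mammo/X-ray, low elsewhere)

# Interpretation hours left after delegation and faster drafting
remaining_interpretation = (
    interpretation_share * (1 - delegated_fraction) / (1 + draft_productivity_gain)
)

# ASSUMED: communication/protocoling tools trim non-reading work by ~10%
remaining_other = other_share * 0.90

total_remaining = remaining_interpretation + remaining_other
print(f"Remaining workload: {total_remaining:.0%} of current hours")
```

Under these toy assumptions the combined effect lands at roughly 70% of current hours, i.e., a reduction of a similar order to the study’s 33% headline figure; the point is only that modest per-task savings multiply across the workday.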
However, innovation isn’t always a time-saver. The study introduces the concept of “opportunistic screening,” where AI finds things the doctor wasn’t even looking for – like hidden signs of bone loss or heart calcium on a standard chest scan. While this is great for patient health, it actually increases the radiologist’s workload because they now must validate and document these extra findings. It’s a classic example of how technology can solve one problem while creating more work in another area.
Beyond the images themselves, AI is poised to act as a digital assistant for communication. New tools can summarize a patient’s messy medical history into a quick “cheat sheet” for the doctor or translate dense medical jargon into plain English for patients. This reduces the time spent on “non-routine communication,” such as playing phone tag with other clinicians or answering confused messages from patients about their results.
If AI is cutting work hours by a third, why aren’t radiologists losing their jobs? The answer lies in the explosion of medical imaging. Over the past decade, the number of scans performed has nearly doubled, while the number of radiologists has remained relatively flat. The study argues that AI’s productivity gains will simply help doctors keep their heads above water as demand continues to rise. Instead of job loss, we are looking at a quality-of-life improvement where AI takes over the “needle-in-a-haystack” tasks that doctors find less interesting.
While the study’s quantitative approach is a major strength, it does have limitations. Much of the data is “U.S.-centric” and based on academic hospitals, which might not reflect how a small community clinic operates. Furthermore, because some of these technologies are so new, the researchers had to rely on their own expert judgment to fill in the gaps where published evidence was missing.
Critically, the study assumes that hospitals will adopt all these AI tools rapidly, which might not happen because of high costs or “resistance to adoption” regarding AI-only interpretations. There is also the unmodeled “AI governance” work – the time doctors must spend monitoring the AI itself to make sure it hasn’t developed errors or biases over time. Beyond the study itself, it is worth noting that the legal responsibility for a misdiagnosis still rests on the human doctor, which may limit how much they are willing to “delegate” to a machine.
Looking ahead, the significance of this work is clear: AI is a partner, not a replacement. The transition will likely mirror the move from physical film to digital “PACS” systems years ago – a change that made radiologists much faster but didn’t make them obsolete. For students eyeing a career in medicine or tech enthusiasts imagining the future of work, the lesson is that innovation usually shifts the type of work we do rather than the amount.
In the end, the most important outcome of this AI revolution might not be efficiency, but “better care.” By automating the mundane, we give human experts more time for complex cases and meaningful patient interaction. As we move toward a future where AI handles the routine, the human radiologist’s role will likely evolve from a high-speed “image reader” into a high-level diagnostic consultant, ensuring that technology serves the patient, rather than the other way around.
Mark Watts is an experienced imaging professional who founded an AI company called Zenlike.ai.

