
By Mark Watts
A researcher at a leading pharmaceutical company had spent decades billing clients thousands of dollars per hour for his research judgment. He once believed his most powerful tools were his instincts and a pen. But today, it’s a chat window.
When a chemotherapy product raised a nuanced question about regulatory exposure in four jurisdictions, he typed the query into a generative AI platform his firm quietly licensed last year. Forty-five seconds later, it returned a synthesis of precedent, statute, and risk assessment more comprehensive than any junior associate could have produced in a week. He reviewed it, added a comment or two, and forwarded it to the client.
His young associates, the ones taking on six-figure debt for the privilege of doing this work, never touched the file.
This is not a story about the future. It is already happening in medicine, in law, in finance, and in engineering. The disruption of the intellectual value chain is underway. And like every revolution, it begins quietly.
Can this happen in radiology, and if so, what will it look like?
For years, resident radiologists have performed the "wet read" of stat diagnostic scans, with overreads by faculty radiologists.
My thesis is that AI will replace the "wet read" step, generating the first pass of a report and triaging studies for review. This could help prioritize cases by true acuity and maximize the productivity of the radiologist.
The radiology profession's value rests on exclusive access to codified knowledge and repeatable processes, and that is precisely what makes it vulnerable. The perceived moat of long training and historical relevance will not protect the status quo.
I believe this profession is close to being automated. Once something becomes teachable, it becomes learnable by machines. Once it becomes learnable, it becomes replaceable.
Today's most revered occupations, including law, coding, finance, and medicine, are deeply codified. They have spent decades standardizing best practices, benchmarking performance, and reducing errors through systematic frameworks. It is this very systemization that puts them at risk.
Google's Med-PaLM 2 is being tested in clinical settings to answer open-ended medical questions with high levels of accuracy. A leader in healthcare has stated that the role of the radiologist is shifting from sole interpreter to partner with an AI that can see the unseen. The organization's magazine notes that AI systems can detect subtle, early-stage disease patterns in scans that are often invisible to the human eye. This enhances, and potentially alters, the fundamental diagnostic process.
I see a future in which the tasks that once justified junior positions are absorbed by machines. This AI pre-reader can draft reports, mark and summarize findings, and conduct research at astonishing speed. It can even check the radiologist's work, flagging anomalies and suggesting edits in near real time.
The future star radiologist will not be the one who has memorized the most pathology patterns; it will be the one who can ask the AI the smartest questions.
This is the beginning of a new kind of expertise, one rooted not in what you know but in how well you can work with what the machine knows. •
Mark Watts is an experienced imaging professional who founded an AI company called Zenlike.ai.

