By Mark Watts
Radiology administrators will understand this scenario: “Mark, we need to do a root cause analysis on an event.” This “event” was the misadministration of nutrients into the lungs of a patient.
A chest X-ray was acquired and processed to confirm nasogastric (NG) tube placement. Before the radiologist's report could get back to the floor, the NG tube was used. This is not the first time this has happened in health care. It is a rare "sentinel event." Could an artificial intelligence (AI) system have prevented it?
The Joint Commission, an organization committed to the continuous improvement of health care for the public, in collaboration with other stakeholders, designates events as sentinel because they require an immediate investigation and response. Accredited organizations are expected to respond to sentinel events with a “thorough and credible root cause analysis (RCA) and action plan.”
The most commonly used form of comprehensive systematic analysis among Joint Commission-accredited organizations is root cause analysis, a process for identifying the factors that underlie variation in performance, including the occurrence or possible occurrence of a sentinel event.
Health care designs systems and trains for these events, and that work is well documented. Examples include requiring two forms of patient identification and the five rights of medication administration. The five rights are:
- Right Patient
- Right Drug
- Right Dose
- Right Route
- Right Time
Variation in performance of these best practices along the decision tree is the issue. Medication reconciliation efforts have created guardrails to assist with the five rights. In the end, however, it is still a human who decides to administer the medication: a real person who gets tired, who gets distracted by personal issues and who is, above all, human.
On March 10, 2019, an Ethiopian Airlines Boeing 737 jet crashed shortly after takeoff. A newly installed flight control system repeatedly pushed the nose of the plane down. This system could not be disengaged, and it overwhelmed the pilots' attempts to control the plane. When the flight control system triggered the dive for the fourth and final time, the pilots fought back by pulling back on their control columns with 180 pounds of force, but the nose of the plane sank even further, and the jet flew even faster.
This is a rare example of a system designed to improve safety that caused a sentinel event.
Artificial intelligence can help us recognize a blood clot in a pulmonary vessel (a pulmonary embolus), but it will never care whether a human has one.
I have developed, with a team of experts, a patented radiopaque non-biological shaped marker paired with AI that recognizes these markers to aid in the safe, correct placement of indwelling tubes and catheters. Even my system would not have prevented the sentinel event described above. AI cannot stop a person from using the nasogastric tube before the radiologist reports that the tube was incorrectly placed.
To err is human; to practice medicine outside your scope or to override safety procedures is an error that even AI cannot overcome.
Mark Watts is the enterprise imaging director at Fountain Hills Medical Center.