
By Mark Watts
My wife, Jennifer, and I were recently invited to tour Air Force One, "the White House in the Sky." It was an impressive exhibition of well-thought-out, purposefully deployed technology. I asked a crew member if autopilot was used to reduce the cognitive burden of flying great distances. "Yes, of course, but never without supervision." "Never?" I asked. "Never."
In both aviation and health care, the introduction of advanced technologies has revolutionized operations, increased efficiencies, and significantly enhanced safety and outcomes. One of the most notable advancements in aviation is the autopilot system, which has drastically reduced pilot workload and improved flight safety. Similarly, artificial intelligence (AI) in health care promises to transform patient care through advanced diagnostics, personalized treatment plans and improved administrative efficiency. Despite these advancements, the need for human oversight remains critical. This article explores the parallels between human oversight of autopilot systems in aviation and of AI in health care, highlighting why human expertise and intervention are indispensable.
Evolution and Impact of Autopilot Systems in Aviation
Autopilot systems have evolved from basic mechanical devices to sophisticated computer-driven systems capable of managing complex flight operations. Early autopilots could maintain a plane’s altitude and direction, but modern systems can handle virtually every aspect of flight, from takeoff to landing. Despite these advancements, pilots are trained to take control whenever necessary, ensuring the safety and success of every flight.
Even with advanced autopilot systems, pilots play a crucial role. They are responsible for overseeing the flight, making real-time decisions during unexpected events, and ensuring the overall safety of passengers and crew. Autopilot systems follow pre-programmed instructions, but human pilots are essential for interpreting data, managing unforeseen circumstances and applying judgment that no machine can replicate.
AI in Health Care: A Transformative Tool with Human Oversight
AI technology in health care has made significant strides, offering tools for early diagnosis, predictive analytics and personalized treatment plans. AI systems can analyze vast amounts of data rapidly, identifying patterns and providing insights that would be impossible for humans to process in a similar timeframe. Applications range from diagnostic imaging and pathology to electronic health records (EHR) management and patient monitoring.
Despite these advancements, health care professionals are indispensable in the application of AI. AI systems can provide valuable data and recommendations, but doctors, nurses, and other medical personnel are essential for interpreting these results, making final decisions, and providing compassionate care. AI lacks the ability to understand the nuances of human emotions, cultural contexts and ethical considerations that are integral to health care. Human professionals ensure that AI recommendations are applied appropriately, considering the broader context of each patient’s situation.
Parallels Between Autopilot Systems and AI in Health Care
Both autopilot systems and AI in health care rely on accurate data and pre-programmed instructions to function correctly. In aviation, autopilots use data from various sensors to maintain flight paths and perform maneuvers. In health care, AI algorithms use patient data to predict outcomes and recommend treatments. However, inaccuracies or anomalies in the data can lead to erroneous outcomes, necessitating human intervention to correct course.
Limitations of Predictive Models
Predictive models in both domains are based on historical data and predefined parameters. While they can identify trends and make predictions, they cannot anticipate every possible scenario. In aviation, unexpected weather conditions, mechanical failures or other emergencies require pilot intervention. Similarly, in health care, rare diseases, atypical patient responses or complex ethical dilemmas require human judgment and expertise.
Ethical and Emotional Considerations
Ethical and emotional considerations are crucial in both fields. In aviation, the safety and well-being of passengers are paramount. Pilots must make decisions that balance various risks and outcomes. In health care, decisions often involve ethical dilemmas, such as balancing the potential benefits and risks of a treatment or respecting a patient's wishes. AI systems, while powerful, do not possess the ethical reasoning or emotional intelligence required to navigate these complexities.
Continuous monitoring and adaptation are essential in both contexts. Pilots monitor autopilot systems and are prepared to take control at any moment, adjusting to new information and changing conditions. Similarly, health care professionals must continuously monitor AI systems, validating their recommendations and adapting treatment plans as new information emerges. This ongoing oversight ensures that both technologies serve their intended purpose without compromising safety or quality of care.
Aviation Incidents
One of the stories shared with us on our tour of Air Force One highlighted the critical need for human oversight. During the 9/11 attacks, President Bush was aboard Air Force One while all air traffic was grounded across the U.S., and any non-responsive airborne plane was presumed to have been hijacked. The Air Force One pilot was alerted that planes were approaching at "supersonic speed." He had only seconds to decide how to respond. Drawing on his full knowledge of aviation and the range capabilities of supersonic jets, he calculated that the planes were friendly; they proved to be fighter jets scrambled from Houston (the President's Guard) to escort Air Force One back to Washington, D.C. This combination of autopilot disengagement and pilot evaluation prevented a catastrophic friendly-fire outcome and underscores the importance of skilled human intervention. Despite advanced autopilot systems, pilots must be prepared to manage unexpected situations and make rapid, informed decisions.
Health Care Scenarios
In health care, there have been instances where AI misdiagnoses or inappropriate treatment recommendations could have led to serious consequences if not for the intervention of health care professionals. I have coined the term "The Fog of Care" for the critical window when decisions must be made in hectic, time-sensitive, life-or-death health care scenarios. For example, AI systems have occasionally misinterpreted medical images or lab results, leading to incorrect diagnoses. Health care professionals are essential in reviewing AI outputs, ensuring accuracy and making the final decisions in patient care.
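The review process described above is often implemented as a human-in-the-loop routing rule: an AI output is acted on automatically only when the model's confidence clears a threshold, and everything else is queued for a clinician. The sketch below is purely illustrative; the `Finding` structure, the `route_finding` function and the 0.9 threshold are assumptions for the example, not any specific vendor's system.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    """A hypothetical AI output for one study: a label and the model's confidence (0.0-1.0)."""
    label: str
    confidence: float

def route_finding(finding: Finding, threshold: float = 0.9) -> str:
    """Route an AI finding: auto-accept only above the confidence threshold,
    otherwise queue it for clinician review. A human stays in the loop for
    every result the model is not highly confident about."""
    if finding.confidence >= threshold:
        return "auto-accept (still audited)"
    return "clinician review"

# A low-confidence finding is routed to a human reviewer.
print(route_finding(Finding("possible nodule", 0.62)))  # clinician review
print(route_finding(Finding("normal study", 0.97)))     # auto-accept (still audited)
```

In practice the threshold itself would be set and periodically re-validated by clinical staff, which is exactly the ongoing human oversight this article argues for.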
The Future
The future of both aviation and health care lies in enhancing the synergy between humans and AI. In aviation, this means developing more intuitive interfaces and better training programs to prepare pilots for integrating AI tools seamlessly into their workflows. In health care, it involves creating AI systems that are transparent, explainable and designed to complement rather than replace human expertise.
Ethical and regulatory considerations will continue to play a significant role in the development and deployment of AI technologies. Ensuring that AI systems are used responsibly, transparently and ethically is paramount. Regulatory bodies must establish guidelines and standards that balance innovation with safety and ethical considerations, protecting the interests of all stakeholders. Continuous improvement and learning are vital for both fields. AI systems must be regularly updated with new data and refined algorithms to remain effective and reliable. Additionally, ongoing training for pilots and health care professionals ensures they remain adept at using these technologies and prepared to intervene when necessary.
The advancements in autopilot systems in aviation and AI in health care represent significant technological progress. They offer enhanced efficiency, safety and outcomes. However, the indispensable role of human involvement cannot be overstated. Pilots and health care professionals bring critical thinking, ethical reasoning and emotional intelligence that AI systems cannot replicate. Ensuring the safety and effectiveness of these technologies requires a collaborative approach, where human expertise and AI capabilities complement each other to create a safer and more effective environment in aviation and health care.
Mark Watts is an experienced imaging professional who founded an AI company called Zenlike.ai.

