Advanced Flight Decks: Engagement vs. Automation
Flying has always been about skill, situational awareness and human judgement. From the Wright brothers onwards, pilots have embodied “The Right Stuff” (if you haven’t seen the 1983 movie, it’s worth a look): skill, determination and resilience. As technology advances — the autopilot, flight-management systems, even early versions of adaptive automation — the human pilot’s role is shifting. For many aviators, this shift brings not only new operational demands but also deeper psychological ones: What is my role now? How do I stay engaged? And what happens if the automation falters?
In this post we explore how automation affects mental workload, pilot identity, and the subtle forms of stress that can build when the cockpit becomes less hands-on.
The paradox of automation: less work, more cognitive stress
It sounds counter-intuitive: automation lowers physical or manual burden, yet research shows it often changes rather than reduces mental workload.
A 2025 scoping review of 90 articles found that as flight-deck automation increases, the cognitive requirements for the pilot change significantly — tasks shift from manual control to supervision, decision-making and monitoring.
Older but foundational research from the Federal Aviation Administration (FAA) showed that automation often shifts burden into monitoring mode — which can feel less engaged and more fatiguing.
A 2024 article noted that while automation can reduce workload, it carries risk of “loss of vigilance on primary instruments” and decreases the human’s sense of being actively in control.
In other words: your hands might not be moving as much, but your brain is still working — often in a different, less visible way. The challenge becomes staying engaged, maintaining situational awareness and keeping your manual skills sharp for when automation drops out.
There’s a psychological parallel here: recent developments in employee wellbeing research and training have solidly demonstrated that people who act as if “on autopilot” at work are less fulfilled, less effective and more likely to experience mental health issues.
Skill erosion, “out-of-the-loop” and identity shifts
One of the deeper effects of high automation is the phenomenon described as the out-of-the-loop (OOTL) performance problem: when human operators monitor systems so passively that their situational awareness and readiness to intervene degrades.
Key points:
Automation reduces the human’s direct input, shifting the role toward supervising and monitoring rather than actively flying.
Over time, the pilot may feel less like “the one flying” and more like “the one watching.” That erodes the sense of professional identity many aviators hold.
Manual skills can rust. The famous case of Air France Flight 447 (2009) is frequently cited as an example of what can happen when automation hands back control and mode confusion takes hold.
That identity shift can lead to subtle stress: “If I’m not doing what I was trained for, am I still a pilot in the same sense?” That question matters, professionally and psychologically. Fortunately, modern approaches to pilot training emphasise the role of the monitoring pilot from the earliest stages of training.
Monitoring mode: Mental workload may appear low — but risk may rise
When pilots feel “less busy,” they may be tempted toward complacency, mind-wandering or lower vigilance. Recent work supports this:
A 2025 field study using EEG and machine learning monitored pilots’ brain states in flight. The researchers classified workload states with high accuracy (≈ 86%), and noted that both under-load (too little to do) and over-load (too much) correlated with degraded attention. This is something pilots are taught consistently: almost every trainee will have been shown the Yerkes-Dodson arousal-performance curve and discussed how to stay at its most effective point.
The 2024 mental-workload study of pilots in different roles found that when functioning as pilot flying (PF) within a crew, perceived mental demand was significantly higher than as pilot monitoring (PM), but in single-pilot or high-automation contexts the monitoring role still had higher physical demand and risk of disengagement.
Research from 2012 showed that lower-level automation (which keeps the pilot more engaged) may increase workload, whereas high-level automation (which reduces pilot tasking) may reduce workload but increase risk of reduced situation awareness.
In simple terms: just because you’re “less busy” doesn’t mean you’re safe. The brain’s job shifts, and unless systems and culture adapt, pilots may face vigilance fatigue, or become unprepared for when the automation hands back control.
Automation and the changing shape of errors
Automation doesn’t eliminate human error; it changes the error types.
Mode errors, confusion about automation state, and expectation mismatch are well documented: Sarter & Woods’ 1993 NASA report emphasised that “How in the world did I ever get into that mode?” reflects pilots’ struggle to understand automation transitions.
The shift means more latent errors (failures in monitoring, planning or decision-making) rather than overt manual mistakes.
Monitoring tasks require high vigilance in low-signal environments; that means mental fatigue and distraction become the risk rather than purely physical/tactical mistakes.
From a psychological viewpoint, this means pilot self-expectations may clash with their emergent role: “I’m supposed to fly, but I’m supervising.” That gap can trigger stress, doubt, and a sense of role dissonance.
Professional identity, meaning and wellbeing
Pilots often join the profession drawn by mastery, precision, and human-in-command. Automation changes that script:
Mastery shifts from stick-and-rudder skills to systems management, decision-making and exception handling. The sense of being “in command” may feel replaced by “overseeing systems” — which may feel less engaging, especially on long sectors where automation dominates. Over time, this can feed psychological themes of reduced professional efficacy (a key burnout criterion) and meaning-drift: “Is this still what I signed up for?”
Therapeutically, this is fertile terrain for reflection: values work, reframing competence in monitoring, and embracing the “human-in-the-loop” not just as back-up but as an integral, adaptive partner. Modern approaches to professional pilot training often centre on a set of pilot competencies that challenge pilots to focus on more than just manual flying skills and automation management.

Enhanced training and skill development in decision-making, workload management, situational awareness and other areas has helped me, as a psychologist working with pilots, to facilitate engagement. Professionalism is important to most of us. As pilots, skill and mastery are very often critically important values at the centre of our psychological wellbeing. Working with values that encompass the whole spectrum of modern pilot competencies is something I strongly believe enhances both mental health and performance at work.
Final approach: embracing the pilot of tomorrow
Automation is here to stay — and it brings huge operational benefits. But the human element remains crucial: how pilots engage, monitor, decide and intervene matters more than ever.
For pilots, the invitation is to broaden how they define “flying”: from stick and rudder to supervising, decision-making and maintaining readiness. For organisations, the challenge is to design automation that supports, not replaces, human engagement — and to recognise the psychological implications of this paradigm shift.
If you’ve ever caught yourself thinking, “I used to fly — now I watch,” you’re not alone. It may be your body and mind telling you it’s time to reconnect — to the job, to your purpose, and to the meaning beneath the controls. The resources on my website may point you toward ways of considering your values and improving your engagement at work.
