May 22 / Dana Wilson-Szucs & Rebecca Lougheed

Cognitive Biases in Aviation

The Hidden Influences Behind Pilot Decision-Making
Aviation is often described as a blend of science, discipline, and judgment. But even with the most advanced training and the latest technology, pilots are still human—and human thinking isn’t always rational. Cognitive biases, the mental shortcuts our brains make, are like the autopilot of our thinking. They’re fast, efficient, and mostly helpful—until they’re not. In aviation, where high stakes and rapid decision-making are the norm, cognitive biases can quietly steer you off course without you even realizing it. So let’s unpack some of the most common ones, and how to keep them from becoming unwelcome passengers in your cockpit.

What Are Cognitive Biases—and Why Do They Matter in Aviation?

Think of cognitive biases as the brain’s lazy hacks. They help us process information efficiently and can even prompt more proactive behaviour in certain situations. In day-to-day life, these shortcuts can be harmless or even helpful, but in aviation, where clarity and precision matter deeply, trading accuracy for speed raises the risk of poor decisions and unsafe outcomes.

From a psychological perspective, biases are influenced by bounded rationality—our brain’s need to make timely decisions without full information—and cognitive load, where high task demands reduce our ability to think critically. These mechanisms exist in all of us, regardless of experience level or skill.

Recognizing and managing these biases is not a matter of intelligence or competence. It’s a matter of awareness.

Six Common Biases That Affect Pilot Thinking

1. Confirmation Bias – “I Knew I Was Right” (Even When I Wasn’t)

This bias causes us to focus on information that supports what we already believe while ignoring what contradicts it. Air France 447 is an example of this—the crew's initial assessment was that the aircraft was in a high-speed (overspeed) descent, and they interpreted later cues as confirming it, even though the aircraft had in fact stalled. Another example is the 2006 Comair Flight 5191 accident at KLEX/Lexington. You can read the full report here, but as a (brief) summary: the crew departed from the wrong runway because what they saw matched 'enough', and they failed to recognise cues which should have alerted them to the error.

Psych tip:
Build in a personal “devil’s advocate” moment. Ask yourself: what would make me question this decision? Better yet, ask your crew.

Pilot tip: See above. It is really hard to spot this in yourself. A good way to avoid it is to ask questions that prompt the other pilot to assess the information independently before providing their answer or assessment. “We turned left at M1 so this is Runway 22, right?” is going to lead them into your bias. “Can you confirm which intersection that was and our position?” avoids it.

2. Expectation Bias – When You Hear What You Want to Hear

Expectation bias is like your brain’s version of predictive text—it fills in the blanks based on what it thinks should be there. In high workload environments, this can lead to critical communication errors—such as hearing an ATC clearance that wasn’t given or confirming a checklist item without actually checking.

This bias is particularly dangerous in fast-paced phases of flight like descent or approach, where small misinterpretations can have outsized consequences. In fact, it is a major contributor to serious events like runway incursions and wrong-runway incidents. Take the 2023 incident involving Southwest Airlines flight 708 and FedEx flight 1432 at KAUS/Austin: low visibility operations were in force, and ATC had cleared SWA 708 for takeoff but made incorrect assumptions about the 737's position on the runway due to expectation bias. The result was two aircraft separated by just 150’.

The Air Canada 759 incident at KSFO/San Francisco is another example. Cleared to land on runway 28R, and having missed a NOTAM telling them the parallel runway was closed, the crew expected to see two lit runways. Instead they saw runway 28R and taxiway C, and lined up on the taxiway. Expectation bias (with a whack of confirmation bias thrown in too).

Psych tip: Readbacks aren’t just a formality. They’re cognitive checks, supporting clear thinking when our brains try to take shortcuts.

Pilot tip: Look out for this in others. That means other aircraft's responses and ATC clearances, as well as your co-pilot!

3. Continuation Bias – The “Just Push On” Trap

Also known as “get-there-itis,” continuation bias is the tendency to stick with a plan even when changing conditions suggest it’s no longer the safest option. It’s often reinforced by the sunk cost fallacy (a.k.a. “We’ve already committed!”) —the belief that having invested time and effort into a plan, we should see it through.

Pilots may find it difficult to divert or delay, even when new data (weather, fuel, systems) indicates a need to reassess. Recognizing this bias means reframing course corrections as sound decisions, not failures.

I love the story of the Nimrod R1 XW666, which ditched into the Moray Firth after an engine fire. Originally the crew were aiming for RAF Lossiemouth, but the Captain, having prepared for the possibility that the plan might need to change, was able to adapt and ditch instead.

Another example occurred in 2019 at KNIP/Jacksonville Naval Air Station, involving a Miami Air 737. Long story short, the crew decided to land on runway 10 and, despite changing weather conditions, failed to amend their plan. During the approach they became unstabilised but still continued. The result was a runway excursion.

Psych tip: Create decision points in your pre-flight planning. That way, changing course isn’t “quitting”—it’s sticking to plan B.

Pilot tip: Always review! Keep asking “what has changed?” and “what have we missed?” so that you are prepared and are ready to adapt if you need to.

4. Overconfidence Bias – “I’ve Got This” (Maybe Too Much)

Overconfidence can affect decision-making by creating an illusion of control or skill beyond what the situation demands. It often develops with experience, especially in familiar settings, where routine can lead to complacency: “I’ve done this approach a hundred times”, “I’ll catch it if something goes wrong.”

This doesn’t mean experienced pilots aren’t safe—it means all pilots, regardless of hours flown, need mechanisms to check their assumptions and remain open to challenge, especially during critical situations. (Check out our previous article to read more about the topic!)

Psych tip: Confidence should come with curiosity. Stay humble. Stay suspicious—in a healthy, checklist-following kind of way.

Pilot tip: Complacency is one (very small) step away from overconfidence. Never stop questioning the conditions, the threats and yourself!

5. Anchoring Bias – The Sticky First Impression

You get one piece of information early in the flight—say, a weather report or fuel estimate—and everything else gets measured against it. Even if new info contradicts it, your brain keeps circling back to that first anchor.

Anchoring can cause delays in adjusting the plan, particularly if that original information framed the entire mental model of the flight.

Psych tip: The first idea isn’t always the best one—give new inputs a fair chance to shift your thinking.

Pilot tip: We can all suffer from this with first impressions, and it can impact how we work with other pilots. I once had a captain treat me the same way he had over 5 years earlier, because he was “anchored” to my capability level from when he first flew with me as a low-hour pilot.

Bonus: Normalization of Deviance – “We’ve Always Done It This Way”

Sometimes, a small shortcut doesn’t seem like a big deal—especially if nothing goes wrong. But when those small deviations start happening regularly, and no one blinks an eye, they slowly become part of the routine. That’s normalization of deviance: when unsafe practices feel acceptable simply because they haven’t led to immediate consequences.

Psych tip: SOPs aren’t just checkboxes—they’re guardrails built from real incidents. If something feels “safe enough” just because it’s worked before, it might be worth a second look. That’s often where hidden risks live.

Why These Biases Persist

Psychologically, biases persist because they reduce uncertainty and make decision-making feel easier. They’re reinforced by:

- Time pressure and stress: Narrow attention and create urgency, making it harder to think critically
- Fatigue: Depletes mental energy needed for reflective thinking
- Cockpit hierarchy or social dynamics: Less experienced crew members may not challenge a decision, even when they sense something is wrong

These factors highlight why bias is not just an individual issue—it’s a team and system-level concern.

So, What Can You Do About It?

First, let’s acknowledge: everyone is biased. It’s how human brains are wired. But just because biases are natural doesn’t mean they’re harmless.

Here’s what helps:

- Awareness: You can’t fix what you don’t notice.
- Metacognition: Taking a moment to reflect on how you're thinking—not just what you're thinking—can be a powerful way to catch biases before they shape your decisions.
- Crew Resource Management: Your colleagues are your best defense against your own blind spots.
- Scenario-Based Training: Simulate the uncomfortable, the surprising, the ambiguous. That’s where bias thrives—and where it can be rewired.

And yes, the irony is not lost on us—your brain, which is both the problem and the solution, will try to tell you this article doesn’t apply to you. (That’s a bias too.)

Some Pilot Tips

We often read the theory on things like biases and think “urgh, theory!” or, worse, “Doesn't apply to me!” They do apply, though, and it is up to us to understand how. Reading the theory and understanding biases is step one. Step two is understanding them in relation to you, and applying what you learn. Reading things like accident and incident investigation reports can really help you ask “How might this affect me?”

We've all got a story or two of our own as well. Reflecting on these, and sharing them with colleagues, is a really good way to encourage more awareness of them. So here's mine:

I was relatively inexperienced, heading into an airport I'd never been to before. I had planned for an ILS, only to be given a visual approach at the last minute which I wasn’t prepared for, and I never managed to stabilise. But I didn't go around. Instead, I pushed on… and on… and landed off it. This was a prime example of continuation bias. I was focused only on landing, and on doing everything necessary to “get it in”.

I've had many years now to review why I got myself into that situation, and how to recognise it occurring again. So what's your example? What have you learned? Where might you need to learn a bit more?

Final Thoughts: Your Brain Is Brilliant… and a Bit Sneaky

Cognitive biases aren’t about being a bad pilot. They’re about being a human one. In aviation, where the cost of error can be high, understanding these biases is a step toward safer, more adaptive decision-making.

Promoting a culture of awareness, open dialogue, and continuous learning can make bias a manageable part of the cockpit environment—not a hidden hazard. And in that process, we’re not just improving safety. We’re also building trust—in ourselves, in each other, and in the systems that carry us through the skies.

Our Advanced Interview Course teaches more than just "how to answer interview questions". It helps you understand you, and it includes access to our 16PF questionnaire.