How and Why We Fail (Understanding Dumb Mistakes)!

N121JM wreckage, aerial photograph, from NTSB Accident Docket ERA14MA271

How many times have you mentally kicked yourself for doing something incredibly stupid, and upon reflection, you can’t even figure out *why* you performed that way? This “all too human” process is fascinating and built into our human operating system. Examining and understanding the little daily goofs and lapses can improve your life and, more importantly, improve your aviation safety so the big, bad stuff can (hopefully) be avoided.

To unpack the psychological process at play here, we first need to understand that most of our activities in life are only semi-consciously decided and most often automatically executed. Our human operating system copes with the daily overload of sensory data, decisions, and actions by using a series of scripts or schema that operate largely out of sight; psychologists call these “implicit.” This is how we can famously drive to our destination in a car and not remember anything about the trip (or that we intended to stop at the store). We can also type 180 words per minute but are usually unable to label the QWERTY keys without tapping them out on a table. This “implicit knowledge” and its associated “scripts and schema” are internal and invisible. They are not even filtered or examined by conscious oversight; they operate in the shadows. (This “dual-process” brain theory was popularized in Daniel Kahneman’s book “Thinking, Fast and Slow.”) We continuously process and act in this automatic mode unless we have the time and motivation to engage in effortful reflection. This natural optimizing has great survival value because it is efficient, but in complex activities it fails to manage risk and can occasionally wreck airplanes.

Let’s examine a typically bad decision: driving home from a party despite being pretty buzzed. Imagine we arrive home just fine despite our incapacitation. Achieving this “success,” we experience a feeling of relief and accomplishment. Now consider the opposite example: a totally sober designated driver takes us home, but we get T-boned and injured. In the first case, a bad decision resulted in a good outcome; in the second, a good decision resulted in a bad outcome. And though decisions and outcomes actually stand alone, that is not how our human brain interprets these situations. The first decision is reinforced as a “success” and the second is discounted as “bad luck.” Letting this “implicit learning” into our subconscious as a standard of operation can have serious consequences for our future safety. Let’s see how our brains subconsciously “code” these experiences.

In the case where our decision was bad but the outcome was positive, the “success” psychologically reinforces our decision with relief and a feeling of accomplishment: “That turned out OK.” Though the decision was defective, this validating success delivers a warm buzz of dopamine that neurologically encodes it in our brains as “acceptable.” Unless we later consciously reflect on this action, to critique and correct the mental coding, a habit can easily develop and become an implicitly learned bad procedure. Clearly, luck was the primary operative factor in both cases, not skill. As humans, however, our “fast processor” tends to evaluate all decisions solely on the basis of the outcome rather than applying an objective standard or evaluating the quality of the decision itself. And once imported, an implicitly accepted standard of “only one drink” can easily slide to “only two drinks” in the same manner; you can see where this ends up going. A more thoughtful approach, guided by objective reflection and decided carefully in advance, is the antidote to this implicit learning and the basis for the “Standard Operating Procedures” used in charter and airline flying. Implicit procedures are self-optimizing and haphazard, sliding to the lowest level of acceptability that luck will tolerate. And we all, unfortunately, know pilots who fly in this manner.

An aviation example of this process at work might start with a successful outcome despite marginal weather; perhaps arriving into Class Delta airspace safely under a 1,200-foot ceiling (even though our personal minimum was “1,500 feet for VFR in Delta”). Unless we later reflect on and critique this dubious “success,” we now have an implicitly accepted “new normal.” Our optimizing human brain codes “that worked out OK,” and this new standard becomes part of our pilot operating system, out of sight and never adjudicated by our “better pilot self.” Pretty soon we find ourselves operating to new and sketchy standards, and we might not even know why or how those standards were established. These implicitly learned, automatic schema are imported “under the radar” and accepted as operational–just like bad code embedded in a computer program. And similarly, they may work fine for a while until they fail suddenly and surprisingly. This neurological process is responsible for the “normalization of deviance” that contributed to NASA’s safety failures with the Space Shuttle: on a “groupthink” level, these implicit procedures had always worked, so the organization “went with it” (the only standard was “success”). Many of these errors are not big and obvious but insidiously erode standards on every flight. This is also how thousands of hours and increased experience can work against us, building complacency rather than excellence.
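For the programmers among us, this “sliding standards” dynamic can be sketched as a toy simulation. Everything here is invented for illustration–the `fly_season` function, the weather range, and the 20% “get-there-itis” pressure are assumptions, not data from any study–but it shows the mechanism: each lucky violation silently becomes the new implicit minimum unless a post-flight critique resets it to the written standard.

```python
import random

def fly_season(flights, written_minimum=1500, reflects=False, seed=1):
    """Toy model of 'normalizing deviance' for a personal ceiling minimum.

    Each flight that violates the current implicit minimum but 'works out OK'
    quietly lowers that minimum (outcome-based coding). If the pilot reflects
    after every flight, the written standard is restored each time.
    """
    rng = random.Random(seed)
    implicit = written_minimum          # the standard actually in use
    for _ in range(flights):
        ceiling = rng.randint(900, 3000)          # today's ceiling (ft)
        # Launch if weather meets the implicit standard, or occasionally
        # under external pressure ("get-there-itis") even when it doesn't.
        launch = ceiling >= implicit or rng.random() < 0.2
        if not launch:
            continue                               # flight scrubbed
        if ceiling < implicit:
            # Got away with it: "that worked out OK" becomes the new normal.
            implicit = ceiling
        if reflects:
            # Post-flight critique against the written minimum resets the slide.
            implicit = written_minimum
    return implicit
```

Run over a season of 50 flights, the reflecting pilot’s standard stays at 1,500 feet, while the non-reflecting pilot’s implicit minimum tends to ratchet downward with every lucky launch, exactly the erosion described above.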

To successfully combat the implicit learning of questionable procedures, an after-flight critique and reflection is essential for safety. Otherwise, our optimizing human brain is always at work creating implicit shortcuts: efficiency over safety. It is vital to schedule a sacred time for personal analysis after every flight; a natural moment is when you log the flight. This reflection is essential to analyze what went right and wrong–and most importantly, “why?” This is also why written personal minimums (or professional SOPs) are necessary to keep a pilot honest and safe. Sliding standards, often implicitly learned, seem to always precede the crunch of aluminum. Fly safely out there (and often)!

See the “SAFE SOCIAL WALL” for more resources.

Join SAFE and get great benefits. You get 1/3 off ForeFlight, and your membership supports our mission of increasing aviation safety by promoting excellence in education. Our FREE SAFE Toolkit App puts required pilot endorsements and experience requirements right on your smartphone and facilitates CFI+DPE teamwork. Our newly reformulated Mentoring Program is open to every CFI (and those working on the rating). Join our new Mentoring Facebook Group.

Author: David St. George

SAFE Director, Master CFI (12X), FAA DPE, ATP (ME/SE). Currently a jet charter captain.

9 thoughts on “How and Why We Fail (Understanding Dumb Mistakes)!”

  1. Wow David! Thanks for this great explanation on how the brain does its thing, for better or for worse. And there are many examples of accidents with bad outcomes that show the unhappy results of the syndrome you describe. I recall a fellow who had a fatal accident with passengers on board in his Bonanza. He was not instrument rated but had filed IFR and lost control in IMC. The investigation revealed that he had received about 20 hours of instrument training years before, but never passed a check ride. According to all his friends he had been filing and flying IFR ever since. Another pilot local to South Texas with tens of thousands of hours, died while flying a GPS approach when the nearest weather reporting was showing 0/0 conditions. And yet another pilot died with his front seat passenger, the backseat passengers surviving with minimal injuries, because they were not wearing seatbelts or shoulder harnesses.

    The common thread in these and many more examples: the behaviors that killed these pilots were not first-time transgressions. These guys had done the things that killed them many times before. Their previous successes apparently “taught” them that their behaviors were perfectly safe and would reliably produce safe outcomes. This is why we have approach minimums. It’s not that going to the forbidden side of the island results in instant death; rather, breaking the taboo will sooner or later bring one face to face with the monster that lives there.

    1. Thanks Charlie. This issue is obviously also tied up with “compliance” (trust), which is the flywheel for our whole aviation system (when is the last time you had to show your pilot cert?). Recent data indicate that 25% of fatal accident victims were flying without a valid medical and 40% were on illicit or impairing drugs…scary! http://bit.ly/Pilot-Meds

  2. I am just entering the aviation field as a future private pilot. I found this to be a great read on a topic I already practice in my transportation career. I’m finding more similarities between flight and what I do on a daily basis for my risk management practices while driving my truck. This may sound corny, but I feel that becoming a pilot and all of the studies I am beginning are also reawakening risk management skills in all areas of my life.
    Thank you for this article.

    1. Thanks Doug. All high-stakes human activities share similar risks and techniques…welcome aboard and best of luck in flight (we can help!)

Tell us what *you* think!
