Most people think “learning from experience” is easy, or natural, but neither is true in a high-consequence environment. It takes serious effort and disciplined awareness to learn well when safety is a critical concern. Some “successes” may be accepted as valuable – but only after careful analysis. Many other “successes” should be rejected as mere “luck” or “one-off winners.” What succeeds as an expedient should never immediately be accepted as a Standard Operating Procedure (SOP). But humans are optimizers by nature, and unfortunately, any action that produces satisfactory results is often immediately incorporated (untested) into our mental playbook of “acceptable actions.” This mental process often occurs subconsciously and is called “normalizing.” It happens automatically, driven by the brain chemistry that rewards “success.”
Just because an action “succeeds” does not mean it should be accepted as an SOP. Every piloting action must be carefully examined and reflected upon; utility is often at odds with safety. When safety is a primary consideration – as in aviation, with its high cost of failure – “success” alone cannot validate procedures. Many unsafe actions and techniques do not immediately reveal themselves as dangerous! And these untested “actions that succeed” can easily get coded into our brains as “acceptable” through that same normalizing process. It is very easy to become incrementally blind to the risk (drift). These actions may be quite hazardous but have not yet reached the “tipping point” of catastrophic failure. If we are honest, we must admit that “plain dumb luck” often protects us from catastrophe. But as the saying goes, “luck is not a reliable planning technique.” Honest reflection is an acquired piloting skill and essential to determining which procedures are safe to retain.
When we normalize (mentally accept) untested and harmful behavior without “scoring” its validity, we are often “drifting into failure.” An unfortunate and very public example of “the smartest people on earth” drifting into failure was NASA and the Space Shuttle accidents. Emboldened by the amazing success of the moon landings, NASA exhibited overconfidence and classic “normalizing” behavior by aggressively launching shuttles well beyond recommended risk parameters. Their luck ran out (twice), and the result was two dramatic and painful accidents. Viewed honestly, “success” (from luck) continually normalizes unacceptable risks. “Success” actually becomes an *impediment* to honest learning by reinforcing behaviors and techniques that endanger future safety. The current backcountry flying craze is following this same pattern. Risky flying that succeeds becomes “normalized” (and even celebrated on YouTube), and this keeps pushing the acceptable edge outward. Challenges taken too far inevitably “drift into failure.”
Judging any action by its results alone is a bad strategy (but this is how we commonly proceed when guided by emotional satisfaction). An example the Stanford Strategic Decision-Making Group presents in training makes this clear:
At a party, you have a few too many drinks and wisely decide to call an Uber to get home instead of driving; good or bad decision? Unfortunately, the Uber gets T-boned on the way home, resulting in multiple injuries. If we judge this decision purely by “results,” it is obviously rated “bad.” Conversely, suppose you had decided to drive home instead and made it without a scratch; is that a vindication of the behavior? You can see “drift” in action here…
In many cases, luck allows us to succeed even when we exercise pretty bad judgment; we get away with it. And without careful reflection, bad decisions can easily become SOPs; we see this every day. Our brain rewards “success” and reinforces these behaviors.
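A quick back-of-the-envelope comparison shows why the decision, not the outcome, is the thing to grade. The probabilities here are invented purely for illustration:

\[
P(\text{crash} \mid \text{drive impaired}) \approx 0.05 \qquad P(\text{crash} \mid \text{take the Uber}) \approx 0.005
\]

On those assumed odds, driving carries roughly ten times the expected harm, regardless of how any single night happens to turn out. One observed outcome is a sample size of one; the quality of the decision lives in the probabilities, not the result.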
Even smart, well-educated people can struggle to learn from experience. We all know someone who’s been at the office for 20 years and claims to have 20 years of experience, but they really have one year repeated 20 times.

Double Loop Learning
To enable honest learning from experience, we need to reflect on every serious action or procedure and score its validity and value; this takes time, effort, and honesty. We need to honestly grade each new technique to verify that it conforms to accepted industry norms. Does it reliably and repeatably create the predicted, desirable outcome? Do the costs outweigh the value received (every action carries some risk)? Serious reflection and analysis are the required tools for learning from experience, and it does not happen “naturally!” Reflective analysis is a skill every pilot must learn and practice for safety. The military version of this same process is the after-action report (a learning opportunity)!
Real progress and improvement (learning, not just problem-solving) occur at a higher level and involve tweaking our mental models to prevent the error in the first place. This requires time to reflect critically on our own behavior and failings, solving the deeper thinking/scripting problems. Level two, or “double loop,” learning freely admits to errors and fixes our inner OS, which is usually the root cause.
Fly safely (and often) and see you at SAFE booth B-2097/8 at Oshkosh. Our SAFE Member Dinner is at the EAA PRC on Thursday evening, July 27th. We need your help at the booth (so I can escape and present at the forums)!
Enjoy the new courses available to members on the new SAFE website. And please download and use the (free) SAFE Toolkit App. It contains all the references a working CFI needs and continuously provides new safety content.
SAFE developed an insurance program just for CFIs! When you are an independent CFI, you are a business (and have legal exposure). This program is the most reasonably priced yet comprehensive insurance plan you can have (and every agent is a pilot!).
It’s a great introduction to a huge topic. How can we teach it? How do we recognize, in our own flying, where we were fortunate? It’s not always obvious. You have to learn enough to be able to ask the question.
The Gibbs model (describe, feelings, evaluate, analyze, conclude, action plan) is a great reflective-debrief tool, though.
I think most of the answer is an honest assessment after every flight (instead of the self-congratulatory chest-beating).
I confess to some flights where it was not super-skill but rather a little bit of mercy that got me through. We need humility and honesty (and yes, that is very difficult for pilots!)
It actually is not that difficult to do it right. The problem is that, while not difficult, it is tedious, and who likes to do stuff that is tedious?
I am going to admit something here that is almost certainly not generally acceptable in our conservative aviation environment: I am mostly a self-taught pilot. Oh, I had really good initial training. My father was a US Navy fighter and test pilot. He taught me things I really didn’t appreciate until MUCH later. For instance, he taught me how to self-check-out in an airplane. Of the 105 different aircraft in my logbook, I did a self-check-out in maybe 2/3 of them. The key is to collect as much information as possible, make initial flights as conservatively as possible, then expand the envelope a little bit at a time, all the while documenting what I have learned.
There is a very interesting thing that comes from this approach to learning various aircraft: you don’t have to un-learn the misunderstandings promulgated by people who accept as gospel the word of others who CLAIM to be knowledgeable. Rarely will the airplane lie to you, something humans can’t claim.
In order for this to work you must be diligent. You must go out and formally test your understanding. When you truly understand something, that understanding will be predictive, i.e., “I am fairly sure this is how this airplane will behave in this particular situation. If it does, my idea MAY be right. If it does not, my idea is DEFINITELY wrong.” Welcome to the Scientific Method.
Lest someone misunderstand, I DEFINITELY seek information from people who appear to know more than I do. However, what I don’t do is accept the information as correct at face value. I craft a test to determine whether the information I have been given is correct. If not, I discard it. If it seems so, I craft a couple more tests to see if the information holds up in multiple scenarios and configurations. If after all that it still seems to be correct, then and only then does it get passed on to others.
I recently had an example of this. I get a fair number of people who come to me to expand their personal operating envelope. Many are low-to-medium-time private pilots who bought or built high-performance aerobatic aircraft, e.g., an RV-14. In this case, while helping the pilot expand his personal operating envelope, he confided that he had recently watched some videos on “Defined Minimum Maneuvering Airspeed.” He applied what he had seen and, frankly, his landings sucked (his words). No surprise there. After some ground school on the lift formula, understanding energy, calibrating the AoA indicator in his airplane, and then practicing landings using AoA vs. various airspeeds, he (like me) came to the conclusion that “Defined Minimum Maneuvering Airspeed” is a bunch of crap.
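For anyone who hasn’t seen it, the ground school centered on the standard lift equation:

\[
L = \frac{1}{2}\,\rho\,V^{2}\,S\,C_L
\]

where \(\rho\) is air density, \(V\) is true airspeed, \(S\) is wing area, and \(C_L\) is the lift coefficient, which is driven by angle of attack. The wing always stalls at the same critical AoA, but the airspeed at which that AoA is reached shifts with weight, load factor, and density altitude. That is why a calibrated AoA indicator is a more direct stall-margin reference than any single memorized airspeed number.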
Now he understands why his airplane does what it does, and his flying is better as a result. It is far too easy to watch something on YouTube, try it, and, if it seems OK, accept it as gospel. Doing the necessary diligence to really and truly confirm what one has seen is TEDIOUS. However, failing to apply that diligence leads to Normalization of Deviance. Real understanding requires both the Scientific Method AND Scientific Skepticism.