What happens if we fail to learn from our near-misses? WITH NEW UPDATES

March 13, 2019

 

“The day soldiers stop bringing you their problems is the day you have stopped leading them. They have either lost confidence that you can help them or concluded that you do not care. Either case is a failure of leadership.” - Gen. Colin Powell

At Experiential Consulting, LLC, we have focused on the importance of learning from near-misses for many years and have helped clients integrate near-miss reporting into their organizational culture. We believe that sharing the lessons from near-misses is the gateway for organizations to develop a culture of openness, feedback, problem solving, and continuous learning. Experts debate whether the factors that cause near-misses are the same ones that ultimately lead to catastrophes or fatalities, but in the outdoor programs we work with, we find that a near-miss can serve as an accident precursor and that there is much to be gained by learning to talk about our near-misses. We have written about this concept extensively and presented on it at conferences.

 

Several recent events (in early 2019) have led us to revisit this topic today. The most newsworthy (and obvious) example is the pair of tragic crashes involving Boeing's 737 MAX and the subsequent global grounding of those planes. As the news continues to come in, we see several themes worth highlighting: systems thinking, learning from near-misses, and ultimately, culture.

What happened to the planes? To summarize the Boeing crashes, which FAA Acting Administrator Daniel Elwell has said appear connected, the pilots struggled to maintain control of the planes shortly after takeoff. That loss of control may be attributed to a new technological feature on the planes called the Maneuvering Characteristics Augmentation System, or MCAS, a safety mechanism that automatically corrects for a plane entering a stall. If the plane loses lift under its wings during takeoff and the nose begins to point too far upward, MCAS kicks in and automatically forces the nose back down. When it functions correctly, this can help prevent the plane from stalling (and eliminate the human error of climbing at too steep an angle). In the first crash, MCAS kicked in and forced the nose of the plane abruptly down at a critical and irrecoverable moment. At the time of this post's publication, evidence continues to come in connecting the two crashes, though the investigations are ongoing.
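For readers who think in code, here is a minimal, purely illustrative sketch (in Python) of the kind of automated pitch-correction rule described above. The names, threshold, and trim value are assumptions made for the sake of explanation; this is not Boeing's actual MCAS implementation.

```python
# Illustrative sketch only: a drastically simplified, hypothetical pitch-correction
# rule of the kind described above. All names and thresholds are assumptions;
# this is not Boeing's MCAS code.

NOSE_UP_LIMIT_DEG = 15.0    # hypothetical angle-of-attack threshold
NOSE_DOWN_TRIM_DEG = -2.5   # hypothetical corrective nose-down trim command

def pitch_correction(angle_of_attack_deg: float) -> float:
    """Return a nose-down trim command when the sensed angle of attack
    suggests the nose is pointed too far upward (stall risk); otherwise 0."""
    if angle_of_attack_deg > NOSE_UP_LIMIT_DEG:
        # The automation intervenes and pushes the nose back down.
        return NOSE_DOWN_TRIM_DEG
    return 0.0

# The hazard the crashes exposed: the rule fires on the sensed value, so a faulty
# sensor reading can trigger an abrupt nose-down command at a critical moment.
print(pitch_correction(18.0))  # -2.5 (nose forced down)
print(pitch_correction(8.0))   #  0.0 (no intervention)
```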

Systems thinking: It's easy to say simply that the planes crashed due to operator (cockpit) error. Or we can back up a step and blame the training the pilots did or didn't receive, or even the plane's manual, which some pilots have called "criminally insufficient." If we keep going, we find a software problem that was discovered in the wake of the first crash, in October 2018 (Lion Air). That software issue was reportedly in the process of being resolved between Boeing and the FAA when the United States government shut down for 35 days, stalling the fix. Backing up even further, the FAA has been led by an interim (acting) administrator for the past two years, as no permanent administrator has been successfully appointed.