“Errors are seen as consequences rather than causes, having their origins not so much in the perversity of human nature as in “upstream” systemic factors…”
Dr. James Reason
Human Error: Models and Management
2000
The airline's financial staff were weighing the resources allotted to this operation against the net income it produced, and they concluded it was not worth it! So, that was the day the airline decided an immediate intervention was needed to halt further losses and protect the initial investment. Meanwhile, the airline was in the process of hiring more operational staff for the new fleet. They were well aware there were not enough qualified, experienced people holding that type rating, but they could make do until they found what they were looking for. The POI and PMI were persuaded to be flexible with the mitigation defenses as established and submitted, even though those officers knew the airline had put those defenses in place merely as evidence on record and nothing more. And while inexperienced staff were being hired to fill the manpower gaps, the airline should also have taken into account that the pilots had passed their sim sessions with minimum academic performance; but who is born fully knowing how to handle new equipment, they asked.
And then there were the routes that had to be flown in that season of the year: marginal-weather routes. They could not avoid making such decisions if they wanted to be competitive and profitable. So, "what went wrong?" was the question the airline had to answer at this investigation board meeting.
Well, how on earth could the airline officers have imagined that this crew would consider raising the nose when the stall horn activated, even though the horn triggers before the critical value precisely as a warning to the pilots? And that this would be the cause of this accident?
Believe me, this accident began long before the pilots raised the nose at the sound of the horn. This crash began the day the balance sheet failed to balance, the day the airline decided to fill its staffing gap with inexperienced people, the day the inspectors were persuaded with defenses that existed only on paper, the day the academic hurdles for the sim sessions were lowered, the day this air carrier performed no management of change when designing the new routes to be flown, and finally the day they never scheduled a meeting to enter all of these failures into the hazard register.
The probable cause of this accident would probably be written, included, and submitted as such in the Aircraft Accident Report, but what they were facing were the actual consequences of all the management failures and decisions the airline made by ignoring everything we know about human error, safety culture, active failures, and latent conditions.
The question, then, is when do these accidents happen? Air crashes and ground incidents or accidents happen when active failures (stemming from organizational latent conditions or culture) are capable of breaking through all four defense layers of James Reason's model; i.e., organizational influences, unsafe supervision, preconditions for unsafe acts, and the unsafe acts themselves.
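The layered-defense idea above can be made concrete with a toy Monte Carlo sketch. This is an illustration only: the four layer names come from the model as described here, but the "hole" probabilities are invented assumptions, not real accident data. The point it shows is how quietly widening the holes in every layer (latent conditions) multiplies the chance that a single hazard penetrates all four.

```python
import random

# Toy Monte Carlo sketch of Reason's layered-defense ("Swiss cheese") model.
# Layer names follow the four levels named in the text; each probability is
# the (purely illustrative, assumed) chance a hazard slips through that layer.
LAYERS = {
    "organizational influences": 0.05,
    "unsafe supervision": 0.05,
    "preconditions for unsafe acts": 0.05,
    "unsafe acts": 0.05,
}

def breaches_all_layers(hole_probs, rng):
    """A hazard becomes an accident only if it passes through every layer."""
    return all(rng.random() < p for p in hole_probs)

def accident_rate(hole_probs, trials=100_000, seed=42):
    """Estimate accidents-per-hazard by simulating many independent hazards."""
    hole_probs = list(hole_probs)
    rng = random.Random(seed)
    hits = sum(breaches_all_layers(hole_probs, rng) for _ in range(trials))
    return hits / trials

healthy = accident_rate(LAYERS.values())

# Latent conditions quietly widen every hole (paper-only defenses, lowered
# sim standards, inexperienced hires): same four layers, bigger gaps.
eroded = accident_rate(p * 4 for p in LAYERS.values())

print(f"healthy system:  ~{healthy:.6f} accidents per hazard")
print(f"eroded defenses: ~{eroded:.6f} accidents per hazard")
```

Analytically, the healthy system lets a hazard through at roughly 0.05^4 = 0.00000625, while quadrupling every hole raises that to 0.2^4 = 0.0016, about 256 times worse, even though each individual layer still "mostly works". That is the insidiousness the text describes: no single layer looks broken in isolation.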
Why do such active failures happen so naturally, with no one able to foresee what was about to happen? This occurs because of the insidious nature of latent conditions, which form a plaster (culture) beneath which a set of "concealed" events (with no obvious or evident negative outcomes for safety) unfolds without anyone noticing or properly reacting to it.
The logical next question is why no one was able to notice or react to the latent conditions. The answer is simple and straightforward: no one could, because no monitoring of latent conditions was ever performed.
Why is that? Why aim at or choose such corrective actions over preventive ones? Because their effectiveness is perceived as more responsive, specifically tailored to the magnitude, size, and style of the active failure. But at the end of the day, actions that do not aim at the core, at the culture that generated those active failures, are more expensive and of limited effectiveness over time, until the latent conditions are once again capable of breaking through the defense layers with a new and reordered set of unsafe conditions.