When the System Breaks Down
By Mel Gedruj
After the first Persian Gulf War ended and the no-fly zone was in effect, the U.S. Air Force was tasked with controlling the skies over Iraq.
This involved complex tasks requiring stringent safety measures to prevent system failure at any point. To that effect, AWACS (Airborne Warning and Control System) aircraft, the most advanced of their kind in the world, were used; in land-based security terms, this would be the equivalent of a C3 (Command, Control and Communications) centre. But on a clear day in 1994, an accident occurred. Despite the most sophisticated systems ever fielded and the highly trained, experienced military professionals involved, two U.S. Black Hawk helicopters were shot down by two U.S. F-15s in a friendly-fire incident. The helicopters were carrying 26 people. None survived. A number of studies were carried out in the aftermath, and a book was even published by the mother of one of the victims.
A number of safety controls failed one after another, from the IFF (Identification, Friend or Foe) validation, to the radio frequencies used, down to the final visual verification.
An article published by Dr. Nancy Leveson of MIT and Dr. Margaret-Ann Storey of the University of Victoria used a systems model of accidents to try to understand what took place. Their step-by-step study clearly showed the blind spots of the system, despite the elaborate structure that was meant to prevent exactly this type of tragedy.
Another incident occurred in 2003 between a British Tornado aircraft and a U.S. Patriot missile battery. Again, after-the-fact analysis identified the systemic failures and inadequacies that led to the event.
Other research of interest was led by James Reason of the University of Manchester, U.K. In his paper "Achieving a safe culture: theory and practice," he demonstrates that an "unsafe" culture is mostly the result of organizational inadequacies rather than individual failings. The primary reason is that an organizational culture cannot be ready-made. It is organic and evolves over time, so whether it develops in the direction of a positive safety culture depends on the quality of leadership and of the workforce. While accidents caused by an individual's inadequacy may have limited consequences, organizational failures often have disastrous outcomes.
Another case is the collapse of Barings Bank, then the world's second-oldest merchant bank. One aspect common to all these situations is the gradual development of an unsafe culture from a mix of ingredients: incompetent or unconcerned leadership, untested procedures, and small failures that go unreported until the big one hits. As we can see, this is not confined to one industry or sector, and it should be viewed as a risk-management problem of the highest order. It especially affects high-tech systems whose complexity cannot easily be comprehended by any one individual, requiring a layering of safety controls. James Reason also developed the accident-causation model known as the "Swiss Cheese Model." It offers an easily understood visual representation: as long as the holes in the consecutive slices of cheese do not align, the barriers hold; an accident occurs when all the holes line up. It fits all the instances discussed here.
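The arithmetic behind the Swiss Cheese Model can be made concrete with a short simulation. The sketch below is not from the article; it is a minimal, hypothetical Python illustration in which each defensive layer independently has some probability of a "hole," and an accident occurs only when every layer fails at once. It shows why stacking independent barriers drives the accident rate down multiplicatively, and, by implication, why a degraded culture that lets holes become correlated is so dangerous.

```python
import random


def estimate_accident_rate(layer_hole_probs, trials=100_000, seed=0):
    """Monte Carlo estimate of the accident rate for layered defenses.

    layer_hole_probs: per-layer probability that the barrier has a "hole"
    (i.e., fails). An accident occurs on a trial only if EVERY layer fails,
    which corresponds to all the holes in the cheese slices lining up.
    """
    rng = random.Random(seed)
    accidents = sum(
        all(rng.random() < p for p in layer_hole_probs)
        for _ in range(trials)
    )
    return accidents / trials


# Three independent barriers, each failing 10% of the time:
# the expected accident rate is 0.1 * 0.1 * 0.1 = 0.001.
rate = estimate_accident_rate([0.1, 0.1, 0.1])
```

With independent layers the estimate converges on the product of the per-layer hole probabilities; if the same organizational weakness punches a hole through several layers simultaneously, that independence assumption collapses and the real-world rate is far worse.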
To build a strong safety culture: allow reporting (even whistleblowing), provide strong leadership, and foster a social process that makes awareness paramount and negligence a thing of the past. Avoid punishing rule violations, a sure recipe for concealment, and instead encourage positive compliance. Ultimately, reducing the impact of individual human errors requires a well-designed and well-tested organizational safety model.
To quote James Reason: “If eternal vigilance is the price of liberty, then chronic unease is the price of safety.”
Mel Gedruj, OAA, CSPM, is the president of V2PM Inc., which specializes in municipal security management planning.