Submitted by Colin Henry, Chair AAMS Safety Committee
President, Colin Henry Aviation, Safety and Quality Consulting LLC

In 2008 the Air Medical industry experienced its worst year on record for accidents, losing 29 of our colleagues. For many years we have experienced “spikes” in our accident frequency, but nothing like what we saw in 2008. This risk management breakdown received the attention of Congress, the Government Accountability Office, the Federal Aviation Administration, industry, the media, and the public, as well as the NTSB. As a result, the NTSB convened industry experts for a hearing in 2009 to gather facts, data, and perspectives to better understand the current state of the Air Medical industry.

After the NTSB hearing, more Air Medical operators moved toward implementing Safety Management Systems (SMS), risk assessment tools, night vision goggles (NVG), and helicopter terrain awareness and warning systems (HTAWS). Some even implemented “soft skills” programs such as Human Factors Threat and Error Management and Just Culture in their organizations.

As mentioned above, the air medical industry experienced grave “spikes” in accidents long before 2008, but never as horrific. It is possible that, as humans, we fall into the complacency “trap”: we become so good at what we do that everything is almost automatic. Since my involvement with the Air Medical industry began in 1988, I can attest that I have trained, helped hire, and worked with many fine professional pilots (≈13) who were lost in fatal Air Medical accidents. Complacency occurs when people feel satisfied, so it is reasonable to expect that it will affect the most experienced people more than novices. The individual learning a new task is often too fearful to be satisfied or content with his or her performance or decisions. Experience is the realm of the expert, and it is desirable, but it carries a hidden trap waiting in ambush for even the most competent individual. As experience and proficiency build, so does comfort in everything we do; with comfort comes the opportunity to relax, and the result is complacency. Attention drifts from the task at hand, and awareness of what is happening, and of what is about to happen, begins to wane. Complacency is one of the most misunderstood contributors to errors, mishaps, and accidents; its very nature makes it difficult to recognize before it causes harm. By minimizing the elements that cause complacency, harmful or even fatal mistakes can be avoided.

One way to stay ahead of complacency is a Safety Culture that requires accountability, especially peer-to-peer accountability. Accountability is one of the true indicators of a Safety Culture: it encourages a system-level perspective and drives performance improvement. There are, however, serious constraints on the development of accountability. We must first remember that humans do not naturally confess to mistakes, and personalities differ; some of us are introverted and some extroverted. People must trust the system to keep their information confidential, and they must receive feedback when incidents are reported. The personal side of accountability can also have blind spots in the form of hazardous attitudes and mental bias.

Some hazardous attitudes are:

  • Anti-authority – “No one tells me what to do!”
  • Impulsiveness – “I (we) need to do something about this now!”
  • Invulnerability – “It can’t happen to me!”
  • Too competitive/Macho – “That’s nothing. Watch this!”
  • Resignation – “Why bother, man? We’re doomed!”
  • Pressing too far – “We’re going to get this done come hell or high water!”
  • Oversized Ego/Vanity – “I’d rather die than look bad!”
  • Procrastination – “When I get the feeling to do something, I lie down until that feeling goes away.”

Your Safety Culture must also include each of the following:

  • Informed Culture
  • Reporting Culture
  • Flexible Culture
  • Just Culture
  • Learning Culture

We must ask ourselves these questions: How is your SMS working for you? Can you truly see your SMS in your day-to-day operations? How are you continuously assessing its performance?

Federal Aviation Regulation 135.607 requires all helicopter air ambulances to be equipped with an approved flight data monitoring system (FDMS) after April 22, 2018. How do you intend to use these devices in your organization? If de-identified information is shared effectively and a true sampling of the data is analyzed, we will have an opportunity to take our industry from reactive to proactive, and possibly predictive in some cases. Effective Line Operations Safety Audits (LOSA) and FDMS can make a difference “where the rubber meets the road.”

An SMS that is truly functioning and performing must have interdependency: each SMS pillar depends on the other pillars to achieve maximum performance. For example, after you install your FDM system:

  • Outline and write policies and procedures on how this piece of equipment will be used. (Safety Policy)
  • Train your employees and communicate information on its use. (Safety Promotion)
  • Decide how and when you are going to access the data. (Safety Assurance)
  • Use the findings to improve your training, operations, or safety procedures. (Safety Risk Management)

These are just some examples that could be expanded as we continue to develop our SMS. Let’s eliminate the “spikes” and work toward a sustained downward trend as an industry. The Vision must be ZERO!