“A Miracle” on the Hudson: A Strong Case For Just Culture In Healthcare

At the National Patient Safety Foundation Congress in May of this year, Jeff Skiles, First Officer of US Airways flight 1549, and Dr. Terry Fairbanks, an emergency physician, safety science expert, and Director of the National Center for Human Factors Engineering in Healthcare at MedStar Health in Washington, DC, combined forces to explain how a systems approach to complex, high-risk work environments can save lives. The presentation is a “must have” resource for anyone working to improve the safety of their healthcare system.

As we have mentioned on ETY in a number of posts, establishing a just culture in healthcare is key to becoming a high reliability organization with a track record of safety similar to that of the airline industry (see A Just Culture Takes Courage and Time, Culture of Respect In Medicine…). Tenets of a just culture ensure that:

  1. Every individual within an organization is not only treated fairly by colleagues, but also takes ownership of their own behavior–every day they come to work.
  2. Teamwork is a must–the collective education and experience of the group, rather than simply that of the most tenured individual, is called upon regularly to keep the system running smoothly.
  3. Reporting of mistakes, near misses and unsafe conditions is a given, and is done openly and with enthusiasm for the next opportunity to prevent real harm from occurring.

Without a just culture and systems approach in place, the “Miracle on the Hudson” could have had a much different outcome. As Skiles speaks, it is apparent he believes less in the ‘miracle theory’ of successfully landing an Airbus A320 with two dead engines in the Hudson River, and more in the systems approach to training and teamwork that he and his first-time cockpit teammate, Captain “Sully” Sullenberger, adhered to throughout their careers.

While leading healthcare organizations have been successful in cultivating a just culture, many more still have considerable distance to travel before wholeheartedly embracing the idea. According to Skiles, accidents are rarely attributable to just one cause. As James Reason’s Swiss cheese model illustrates, accidents are usually the result of a series of smaller factors, overlooked or ignored, coming together at the same time–factors that, had they been reported and attended to, could have stopped a safety event from occurring. The culture of an organization sets the tone for what is attended to, what is overlooked and what is deemed a priority.

Fairbanks, a pilot as well as a physician, acknowledges the differences between the airline industry and healthcare, but also highlights similarities between the two systems that allow human factors and safety science to be applied to healthcare processes, leading to safer systems. The differences discussed in his presentation, however, seem to have less to do with the day-to-day complexities of each industry, and more to do with the degree of willingness each has to accept that humans are fallible.

Jeff Skiles, NPSF Presentation, May 2012

The airline industry openly acknowledges that “to err is human”. For a long time, many in healthcare have denied this reality, even in light of the seminal 1999 report of the same name. In this NPSF presentation, Fairbanks reminds the audience that everyone makes mistakes; however, in healthcare “we have learned…we can’t make mistakes”. This unrealistic expectation needs to change–and quickly–as the number of errors occurring in healthcare continues to move in the wrong direction.

The error tracking theory presented by Skiles at NPSF outlines that, for the airline industry, it is:

1) Vastly more important to identify the hazards and threats to safety than to identify and punish an individual for a mistake.

2) More important to give up the ability to reprimand an individual in exchange for the ability to gain greater knowledge.

A second major difference between healthcare and aviation lies in the reporting of unsafe conditions, near misses and accidents. While aviation reports every event, or near-event, to one governing body, healthcare still struggles with a collective understanding of what to report and with having a just culture in place that supports reporting. According to Fairbanks, 600 near misses provide a warning before 1 safety event occurs. Getting at this information proactively, and being able to intervene before harm actually occurs, would be invaluable. If that information is never shared, the opportunity to prevent harm is missed altogether.

The airline industry has evolved over time to incorporate the tenets of a reporting and just culture, and has succeeded in becoming a high reliability industry. The results? The last commercial airline accident with fatalities occurred in November of 2001. For US Airlines, the last fatal accident occurred in 1992. Fear of flying becomes more and more irrational with results like these. Fairbanks believes healthcare can achieve similar results, but in order to do so, a major shift in the lens through which providers choose to view their own fallibility will need to occur on a more universal scale.
