HPI Safety Summit: Communication Key Component of High Reliability Journey

The cab driver who took me to the Cincinnati airport as I left the HPI Safety Summit last week was from Ethiopia. He made a point of saying “east Africa” when I asked where he was from, as the Ebola virus had received greater than average attention in the Queen City after one of the two nurses who contracted the virus recently passed through northern Ohio. Even though she never came closer to Cincinnati than about 250 miles, nearly every cab driver we encountered during our stay mentioned Ebola. This gentleman, in broken but completely intelligible English, shared that he had been in Cincinnati for nine years, and lamented that his accent remained far too apparent while his young children now spoke perfect English. Our conversation turned to the beauty of different cultures and their languages, and he told me 80 different languages are spoken in his small country of origin. A quick Google search confirmed this, as well as the fact that anywhere from 1,500 to 3,000 different languages are spoken across the African continent. It dawned on me that such disparate means of communication might contribute not only to a lack of understanding, but with it, to the slow pace of development experienced across Africa as a whole. He agreed, which led to an impassioned explanation of how the inability to communicate in a common language leads to a lack of trust among clans, to violence, and often to the need to hire a translator just to travel from the north to the south of Ethiopia. I told him he had inspired an ETY post, as it is becoming increasingly clear that data and fact lose almost every time to fear, ignorance, poor communication and a good old-fashioned wives’ tale spun by a convincing storyteller.

Communication was one of the overriding themes at the Safety Summit as well. HPI is endeavoring to make the language of patient safety universal across healthcare by providing consistent, process-driven training that gives healthcare professionals a vocabulary in high reliability, resilience and a systems approach to care. The number of partnerships HPI has formed with healthcare organizations across the US seeking to join the high reliability journey is growing, and with it grows the number of patient lives positively impacted by those employing their teachings at the frontlines of care. Their teaching excels in parallel with a client’s ability to communicate the learning, and the session my colleague Erin Agelakopolous and I presented on the topic was standing room only. With newer clients in attendance at this year’s Safety Summit, many wanted to understand how we were communicating the HPI learning across a health system of 30,000. We shared the tool kit designed by MedStar’s Communications team, our 60 Seconds for Safety videos, patient and provider stories, and a Good Catch program recognizing the excellent work at our frontlines while reinforcing the learning culture HROs need to thrive. And we shared that this has indeed been a journey, with our internal communications efforts growing in tandem with a collective comfort level as the new just culture tenets are increasingly embraced.

There were many excellent sessions at HPI’s Safety Summit. Of particular note was the keynote given by nationally recognized patient advocate, DePaul University Professor, Mom, MBA and former McKinsey consultant Beth Daley Ullem. Beth emphasized the need for healthcare consumers to have access to data and information about the healthcare procedures they are purchasing. “We spend more time evaluating the purchase of mutual funds,” she said, “than heart surgeries.” Having lost a child to preventable medical harm directly related to the culture of medicine, Beth and her approach to this work provided yet another inspirational reminder that we need a greater sense of urgency around the change we were all in Cincinnati to support.

The Children’s Hospital of Philadelphia shared their Good Catch program in a session. Cancer Treatment Centers of America shared their HRO internal communication campaign and Safety Superheroes. Piedmont Healthcare shared how they are communicating their effort to take transparency to the next level. All expressed how important it is to find ways to communicate HPI’s high reliability teachings and culture change across the health system. Being at the Summit meant spending time with those already drinking the Kool-Aid of culture change. With all the social media and content development tools available to us, we now need to figure out how to take this excellent work, along with the messages of just culture, transparency and open, honest communication in healthcare, viral.

For more information on HPI and the Safety Summit, go to: www.hpiresults.com


An Addendum to Annie’s Story

Following is additional information from our team who helped share Annie’s Story, led by RJ (Terry) Fairbanks (@TerryFairbanks), MD MS, Director, National Center for Human Factors in Healthcare, MedStar Health, Tracy Granzyk (@tgranz), MS, Director, Patient Safety & Quality Innovation, MedStar Health, and Seth Krevat, MD, Assistant Vice President for Safety, MedStar Health.

We appreciate the tremendous interest in Annie’s story and wanted to respond to the numerous excellent comments that have come in over YouTube, blogs and email. The short five-minute video sharing Annie’s story was intended to share just one piece of a much larger story–that is, the significant impact we can have on our caregivers and our safety culture when the traditional ‘shame and blame’ approach is used in the aftermath of an unintended patient harm event. At MedStar Health, we are undergoing a transformation in safety that embraces an all-encompassing systems science approach to all safety events. Our senior leaders across the system are all on board. But more importantly, we have nearly 30,000 associates we need to convince. Too often in the past, our root cause analyses led to superficial conclusions that encouraged re-education, re-training, re-policy and remediation…efforts that have been shown to lack sustainability and to decay very shortly after implementation. We took the easy way out and our safety culture suffered for it.

Healthcare leaders like to believe we follow a systems approach, but historically, in most cases, we have not. We often fail to find the true contributing factors in adverse events and hazards, and even when we do, we frequently employ solutions that, viewed through the lens of safety science, are ineffective or unsustainable. Very often, events facilitated by numerous system hazards are classified as “nursing error” or “human error,” and closed with “counseling” or a staff in-service. By missing the opportunity to focus on the design of system and device factors, we may harm individuals personally and professionally, damage our safety cultures, and fail to find solutions that will prevent future harm. It was the wrongful damage to the individual healthcare provider that this video was intended to highlight.

In telling Annie’s story, we chose to focus on one main theme–the unnecessary and wrongful punishment of good caregivers when we fail to cultivate a systems inquiry approach to all unfortunate harm events. This is the true definition of a just culture…the balance between systems safety science and personal accountability of those who knowingly or recklessly violate safe policies or procedures for their own benefit. Blaming good caregivers without putting the competencies, time and resources into truly understanding all the issues in play that contributed to the outcome is taking the easy way out. We wanted our caregivers to know we are no longer taking the easy way out…

You will be happy to know that the patient fully recovered, that Annie is an amazing nurse and leader in our system, that the hospital leaders apologized to her, and that all glucometers within our system were changed to display blood glucose results clearly. We believe we have eliminated a hazard that would have continued to exist if we had only focused on education, counseling and discipline centered on “be more careful” or “pay better attention.” We also communicated the issue directly to the manufacturer, and have presented the full case in several venues, in an effort to ensure that this same event does not occur somewhere else.

This event, which occurred over three years ago, gave us the opportunity to improve care across all ten of our hospitals. It also highlighted the willingness of our healthcare providers to ask for help because they sensed something was not right and wanted to truly understand all the issues–they also wanted to find a true and sustainable solution to the problem using a different approach from what had been done in the past. Thanks to everyone for sharing your thoughts and for asking us to tell the rest of the story. We have updated the YouTube description as well.

And, thanks to Paul Levy for opening up this discussion on his blog, Not Running A Hospital, and to those of you who continue to share Annie’s story.

For those who have yet to see the video, here it is:


Taking the Easy Way Out When Errors Occur

Historically in healthcare, when an error occurred we focused on individual fault. It was the simplest and easiest way to make sense of any breakdown in care: find the person or persons responsible for the error and punish them, mostly through things like shame, suspension or remediation. Re-training, re-education and re-policy were the standard outcomes of any attempt at a root cause analysis. Taking that route was easy because it didn’t require a lot of time, resources, skills or competencies to arrive at that conclusion, especially for an industry that lacked an understanding or appreciation of systems engineering and human factors. High reliability organizations outside of healthcare think differently, and have taken a much different approach through the years, because they appreciate that it is only by looking at the entire system, rather than placing blame on a lone individual, that they can understand where weaknesses lie and fix the true problems. As James Reason astutely said, “We cannot change the human condition, but we can change the conditions under which humans work.”

The following short video is about Annie, a nurse who courageously shares her own story…a story that highlights a time when we didn’t do it right, but subsequently learned how to do it better by embracing a systems approach built on a fair and just culture when errors occur. A special thanks to Annie, and to Terry Fairbanks MD MS, Director of the National Center for Human Factors in Healthcare, who helps us make sure our health system affords the time, resources, skills and competencies necessary to do it correctly.


Resilience Engineering: A Novel Approach to Keeping Patients Safe

Following is information on another excellent educational opportunity offering a novel approach to keeping patients safe, coming up in June!

Terry Fairbanks MD, Director of the National Center for Human Factors in Healthcare, and Neil Weissman MD, Director of the Health Research Institute, both at MedStar Health, are hosting a two-day, innovative, workshop-style conference to share knowledge, spark innovative ideas, and inspire new collaborations and partnerships to apply resilience engineering in healthcare.

The conference, entitled “Ideas to Innovation: Stimulating Collaborations in the Application of Resilience Engineering to Healthcare,” will be held on June 13-14, 2013 at the Keck Center of the National Academies in Washington, DC. Additional information and registration can be found at http://www.resilienceengineeringhealthcare.com.

What is Resilience Engineering and how is it used in Healthcare? Simply put, it is how individuals, teams and organizations monitor, adapt to, and act on failures in high-risk situations. In greater detail:

Resilience engineering is an emerging field of study that focuses on the fundamental systemic characteristics that enable safe and efficient performance in expected (and unexpected) conditions. It is a paradigm for safety in complex socio-technical systems, and its application to healthcare has so far been very limited. During this two-day workshop, leading researchers and practitioners in resilience engineering and resilient health care will present a set of principles, practices, and desired outcomes and products. After these presentations, attendees will be asked to initiate discussions leading to the development of an effective roadmap to catalyze the idea-to-innovation process; the ultimate goal is to help health care organizations and other interested parties improve quality and safety.

“Ideas to Innovation: Stimulating Collaborations in the Application of Resilience Engineering to Healthcare” is hosted by the MedStar Health Research Institute and the University-Industry Demonstration Partnership (UIDP) as the first conference in UIDP’s Ideas to Innovation series.


MedStar Health: Moving Patient Safety Into The 21st Century

Yesterday, I introduced the multi-disciplinary team at MedStar Health System, led by Terry Fairbanks MD/MS, Director of the National Center for Human Factors Engineering in Healthcare, and the team’s proposal for an Integrated Patient Safety Transformational (PST) Model to prevent and mitigate harm to patients. A description of the model, and a proposed plan for dissemination, follow. Comments are always welcome!

The team’s PST model is based on the clinical concept of primary, secondary and tertiary disease prevention–where interventions are targeted first at preventing disease, then at identifying and controlling risk factors, and finally at mitigating disease if/when it occurs. The example the team provides is for Cardiovascular Disease Prevention:

  1. Primary prevention includes things like encouraging a healthy lifestyle or smoking cessation programs.
  2. Secondary prevention encompasses screening for risk factors and then controlling those risk factors.
  3. Tertiary prevention includes strategies employed after a heart attack or to optimize congestive heart failure management.

The PST model for patient safety improvement takes a novel approach by shifting the typical focus of healthcare’s medical error management from “after the fact” to the primary and secondary areas of opportunity. The model is designed to proactively identify existing hazards and mitigate them before error occurs, and to enhance the management of error if it does occur through transparent disclosure when appropriate (tertiary prevention). These efforts will not only protect patients from preventable harm, but also have the potential to provide considerable cost savings to the health system.

Five “best practice” modules were selected to address these three areas of prevention, with the ability to design additional modules if a need arises. Each module will be measured for its individual efficacy, as well as for the efficacy of the model as a whole. The flow of the proposed PST model will include:

  1. Primary prevention: Module 1 (Proactive Risk Assessment), Module 2 (Enhanced Patient & Family Satisfaction), Module 3 (“Warm Handoff” including team dynamics and physician-patient communication strategies)
  2. Secondary prevention: Module 4 (Hazard Alerting Loop, which reports hazards and collects, analyzes, trends, and feeds back hundreds of reports to staff)
  3. Tertiary prevention: Module 5 (trains a “Go Team” to immediately address medical error through disclosure, apology and compensation where appropriate, support for the staff involved, and immediate initiation of a systems-safety-based event review)

The team is striving to apply the model within MedStar Health operations, but is awaiting AHRQ’s review of their grant proposal, which, if awarded, will allow an intensive implementation in the emergency medicine setting. It is important to note that MedStar has a unique proving ground within their system: the five EDs where they hope to test the model reflect the diversity and breadth of the urban, suburban and rural areas in Maryland and the District of Columbia. Several of MedStar’s community hospitals, along with the larger teaching and tertiary care centers, serve diverse populations and present an opportunity to conduct research that is inclusive of many patient populations.


MedStar Health: Using A Multi-Disciplinary Approach to Catch Medical Errors Before They Occur

Hundreds of near misses occur for each adverse event in complex industries, according to James Reason’s research, yet many health systems continue to try to solve the medical error problem by focusing on the proverbial needle in the haystack: the adverse event, which still occurs all too frequently. So frequently, in fact, that the World Health Organization recently reported the odds of dying due to a medical error to be 1 in 300, compared to the likelihood of dying in an airline crash, which they estimate to be 1 in 10 million, making death from medical error more than 30,000 times as likely.

But the MedStar Health System in Maryland and Washington, DC is increasing its odds of finding a new way to prevent medical errors before they occur. Leadership at this innovative health system is successfully collaborating across multi-disciplinary lines, with safety scientists, innovation researchers, safety & quality leadership and their simulation center, SiTel, coming together to move patient safety into the 21st century. Instead of waiting for a medical error to occur, Terry Fairbanks MD, MS (Director), Zach Hettinger MD, MS and their team at the National Center for Human Factors Engineering in Healthcare are taking into account the opportunity those hundreds of near misses present, and the team is looking to test their Integrated Patient Safety Transformational (PST) Model, which provides a proactive approach to medical error prevention by catching errors before they occur.

If a medical error should occur, the PST model also includes the same tenets of disclosure and transparency that the 7 Pillars approach is based upon, and which received a $3M AHRQ grant in 2010. David Mayer MD, our blog host and now VP of Safety & Quality at MedStar, is co-Principal Investigator on the 7 Pillars grant, along with Principal Investigator Tim McDonald MD/JD at the University of Illinois-Chicago Medical Center. With Mayer’s arrival at MedStar in May of this year, it became apparent that Fairbanks’ human factors work raised the 7 Pillars to new heights (and vice versa). The model is now complete, providing an organized and optimized response to adverse events that emphasizes improving safety and reducing liability risk before an error occurs, while also offering a progressive approach to working with patients, families and caregivers if events do occur.

A description of the PST model and its genesis will be provided in tomorrow’s post. In the meantime, please share your innovative ideas around preventing medical error and keeping patients safe.