I recently spoke with Diane Miller, VP & Executive Director of the Virginia Mason Institute, about what it takes to change the culture of a healthcare organization in order to reach high reliability. She spoke of leadership and of teamwork, but she also mentioned courage. If, on any given day, one of the 30 leaders at Virginia Mason Medical Center (VMMC) needs a supportive boost to overcome an obstacle to keeping their hospital pointed true north, in the patient's best interest, the well-established just culture at VMMC ensures that one of the other 29 leaders will pick that person up, the team never losing stride. They have honed a collective courage and, as the saying goes, have one another's backs. The journey then continues, all now stronger for having overcome the challenge together.
It's well documented in the literature that the existing culture of a healthcare organization is an indicator of the challenges that will lie ahead on the high reliability journey. According to James Reason, a just culture is one of four subcultures it takes to ensure an informed culture, a safe culture, one that maintains "…states of wariness…to collect and disseminate information about incidents, near misses and the state of a system's vital signs" (Weick & Sutcliffe, Managing the Unexpected, 2007). According to Reason, the sophistication of error reporting that contributes to an informed culture requires a level of trust throughout an organization that overcomes controversy; in a word, courage. The other three subcultures Reason believes an informed culture requires are:
- Reporting culture—what gets reported when people make errors or experience near misses
- Flexible culture—how readily people can adapt to sudden and radical increments in pressure, pacing, and intensity
- Learning culture—how adequately people can convert the lessons that they have learned into reconfigurations of assumptions, frameworks, and action
Patrick Hudson, a psychologist who has done extensive and notable work in the oil and gas industry and has published a number of papers on high risk industries, shares a model of cultural maturity in his article, Applying the lessons of high risk industries to health care (Qual Saf Health Care 2003;12:i7-i12). His adapted model shows a safety culture to be evolutionary, moving up from:
- Pathological: safety is a problem caused by workers. Main drivers are the business and the desire not to get caught.
- Reactive: safety is taken seriously only after harm occurs.
- Calculative/Bureaucratic: safety is imposed by management systems; data are collected, but workers are not on board.
- Proactive: workforce starts to embrace the need for improvement and problems found are fixed.
- Generative: there is active participation at all levels of the organization; safety is inherent to the business and chronic unease (wariness) is present.
According to Hudson, organizations with an advanced safety culture possess the following qualities, which he refers to as dimensions:
- It is informed
- It exhibits trust at all levels of the organization
- It is adaptable to change
- It worries
Moving an organization through the cultural evolutionary continuum to an advanced safety culture is not an overnight task. In fact, during our conversation, Diane also reminded me that the journey to VMMC's level of success took time. Time and courage are both required in a day and age when quick fixes and the status quo no longer work, yet remain challenging to change. Both will be necessary to address the culture change needed in medicine, recently called for in the Institute of Medicine (IOM) report released September 6th. VMMC was mentioned during the IOM webcast covering the report as an example of the bold leadership it took to get to where they are today, and where others will soon need to follow. (More information on the recent IOM report can be found here.)
Our review of the reasons why, and the methodology behind, incident reporting continues. We welcome your thoughts, and reference summaries as well.
Dr. R Lawton from the School of Psychology at the University of Leeds and a colleague looked to better understand "the willingness of healthcare professionals (doctors, nurses, and midwives) to report colleagues to a superior member of staff following an adverse incident or near miss." They also explored "the difference in reporting of events involving three kinds of behavior defined by (James Reason)–compliance with a protocol, violation of a protocol, and improvisation where no protocol exists." Lawton theorized that the culture of medicine, along with the increasing fear of litigation, would likely constrain healthcare providers from reporting.
And that is almost exactly what he found, as results showed:
- Doctors were less likely to report a colleague across the board, even when a colleague deliberately went around protocols.
- Nurses were the group most likely to report if there was a bad outcome for the patient.
- Protocol violations were reported most frequently, regardless of outcome.
The authors speculate that doctors' unwillingness to report protocol violations to the same degree as their nursing and midwife colleagues may reflect the perception that protocols are viewed "by many in the medical community…as a threat to their professional autonomy," and that doctors' reluctance to report a colleague reflects "a professional culture in which what may be seen as whistle blowing is taboo." Either way, the authors conclude that culture change within the NHS may first have to occur in order for incident reporting to deliver its true benefits.
The introduction of Evans and colleagues' paper on incident reporting gives an excellent overview of the benefits these reports provide:
Incident reporting captures more contextual information about incidents and, when actively promoted within the clinical setting, it can detect more preventable adverse events than medical record review at a fraction of the cost. Near misses are rarely documented in medical records, yet occur more frequently than adverse events and provide valuable lessons in recovery mechanisms without the detrimental consequences of an adverse event. The subjective data provided by incident reporting enable hypothesis building and preventative strategies to be developed and tested. (See article for references).
An anonymous survey, modified from Vincent et al (J Eval Clin Pract 1999), was given to participants (186 doctors and 587 nurses, both groups with >70% response rates), asking the following:
- Do you know if your hospital has an incident reporting system?
- If yes, do you know what form to use to submit a report?
- If yes, do you know how to access the reporting system?
- If yes, do you know how to submit a report?
- How often do you report 11 iatrogenic injuries (listed in Figure 2), and how often should these injuries be reported?
- Nineteen potential barriers to reporting were evaluated using a Likert scale (listed in Table 2)
Results indicated that:
- Doctors and nurses were equally aware of an incident reporting system at their institutions, but nurses were significantly more likely to have filed a report (89.2% v 64.6%, p<0.001). This may have to do with the fact that nurses also knew how to locate, and what to do with, a report to a significantly greater degree.
- Senior doctors were significantly less likely than their younger colleagues to have submitted a report.
- Both doctors and nurses completed reports most often for falls and least often for pressure sores.
Perceived barriers to reporting for doctors were: 1) Lack of feedback 2) Form took too long to complete and 3) Incident was perceived as too trivial. Barriers for nurses were: 1) Lack of feedback 2) Belief that there was no point in reporting near misses and 3) Forgetting to report when the ward was busy.
Of note in this study was that a poor reporting culture had less to do with the cultural environment and more to do with the functionality of the reporting system. Authors did note however, that the “poor reporting practices by doctors…probably reflects the prevailing deeply entrenched belief in medicine that only bad doctors make mistakes.” Authors conclude by highlighting the importance of sharing with staff the changes that are implemented as a result of the incidents reported.
Ravi Mahajan from the Division of Anaesthesia and Intensive Care at Queen's Medical Centre in Nottingham, UK, reviews how high reliability organizations, such as those in aviation and the rail industry, have long used incident reporting as a learning tool for improvement, yet that same well-documented tool has not caught on in healthcare. According to Mahajan, the main reason to report incidents to improve patient safety, which is also well documented:
…is the belief that safety can be improved by learning from incidents and near misses, rather than pretending that they have not happened.(5)
He states that leadership, larger governing bodies of healthcare, and consumers are calling for the information incident reports provide in order to "better understand errors and their contributing factors." Mahajan highlights the World Health Organization (WHO) as having outlined guidelines for implementing effective reporting systems. Additional information on the WHO reporting guidelines can be found here.
What Mahajan also includes in his review is the need for a human factors approach to the analysis of medical errors, one that considers the human component within the larger context of the health system. Instead of settling for the quick and easy summation of an error as "someone's fault," a human factors approach considers all the events leading up to the error through a much larger lens, including Reason's "active and latent failures" mentioned earlier. All of this, he says, is incorporated into the framework for analyzing critical incidents suggested by Vincent et al (BMJ 1998;316), which draws on the socio-technical pyramid discussed by Hurst and Ratcliffe, adapted to the clinical setting, and provides a structured approach to a meaningful root cause analysis of the error. The framework can be found in Table 1 of Mahajan's review.
Without meaningful feedback, however, Mahajan and others continue to point out, the reports and the analysis are meaningless.
As we've mentioned, the road to high reliability starts with the formation of a just culture that supports the reporting of unsafe conditions, near misses, and adverse events, in order to uncover those conditions within a system that make it prone to harm. It's a simple statement, one that makes intuitive sense, so why, then, has a reporting culture evaded medicine? The following authors weigh in on the how, what, and why of incident reporting to show that any related growing pains are well worth the struggle in the best interest of our patients.
Please share references and information that will help raise our collective knowledge, and provide a road map for others seeking to build a reporting culture en route to high reliability. Our patients are depending on us to take this journey–
In this 2002 paper, Lucian Leape, MD, reiterates the recommendation of the Institute of Medicine's To Err Is Human report, calling for the then-controversial expansion of reporting around serious adverse events and medical errors. He also highlights that, in order to reduce the frequency of harm befalling patients, a greater understanding of the harm and its causes is needed "for the development of more effective methods of prevention (as) it seems evident that improved reporting of accidents and serious errors that do not cause harm ("close calls") must be an essential part of any strategy to reduce injuries."
Leape describes the primary purpose of reporting these events as learning from them, and the only way to learn is first to be aware the problem exists. Additional reasons for developing a robust internal reporting system, according to Leape, include:
- Allows for monitoring of progress
- Allows lessons to be shared so others can avoid similar mishaps
- Holds everyone accountable
Table 2 in his report lists the characteristics of a successful reporting system along with an explanation. In brief, those characteristics are: 1) Non-punitive 2) Confidential 3) Independent 4) Expert analysis 5) Timely 6) Systems-oriented 7) Responsive
James Reason has been mentioned more than once on this blog because of our focus on becoming a high reliability organization. Reason's work on just culture, and his in-depth research examining the person versus systems approach to understanding medical error, reinforce the need for a reporting culture in order to achieve high reliability. Reason writes:
Effective risk management depends crucially on establishing a reporting culture.(3) Without a detailed analysis of mishaps, incidents, near misses, and “free lessons,” we have no way of uncovering recurrent error traps or of knowing where the “edge” is until we fall over it…
…Trust is a key element of a reporting culture and this, in turn, requires the existence of a just culture–one possessing a collective understanding of where the line should be drawn between blameless and blameworthy actions.(5)
Reason’s explanation of a just culture is one in which error reporting is handled in a non-punitive manner, looking to understand active failures and latent conditions within a systems context. However, he recognizes that within the system, each individual remains accountable for their actions. In a high reliability organization, every individual is reminded of the value of incident reporting as the focus is put upon intentionally looking for anything that could result in harm.
Charles Vincent raises the point that incident reporting is only as effective as the measurement and patient safety programs that result from gathering the reports. As many agree, one of the reasons physicians give for failing to report is that, having invested the time and emotional energy to do so, they then watch the report sit without response or action. Vincent editorializes that:
…a functioning reporting system should no longer be equated with meaningful patient safety activity. Organisations must move towards active measurement and improvement programmes on a scale commensurate with the human and economic costs of unsafe, poor quality care.
The follow-up on each report is reinforcement for the next incident to be reported. And it must be meaningful, productive feedback that rewards those who take the time and stick their necks out to share information.