Transparency in Healthcare: NYTimes Invitation to a Dialogue Continued

Paul Levy extended an open invitation to healthcare colleagues on the New York Times Opinion page in a letter to the editor entitled "Invitation to a Dialogue: When Doctors Slip Up." Here is an excerpt from that letter:

The tendency to assign blame when mistakes occur is inimical to an environment in which we hope learning and improvement will take place. But there is some need to hold people accountable for egregious errors. Where’s the balance?…People in the medical field are well-intentioned and feel great distress when they harm patients. Let’s reserve punishment for clear cases of negligence. Other errors should be used to reinforce a learning environment in which we are hard on the problems rather than hard on the people.

This has been a continued struggle, both spoken and unspoken, for years. As Paul points out, well-intentioned and hard-working physicians, nurses, pharmacists and other care team members come to work each and every day trying to help heal those in need. Yet our current health care system fails them (and our patients). It also fails to create a learning environment in which well-intentioned caregivers can share potential areas of weakness or adverse events without fearing for their own careers. Because the existing culture of medicine has been very slow to change, I have always believed that educating the young was a silver lining of sorts, a way to rebuild our culture from the ground up. Educational content on open and honest communication with patients and colleagues has been the core curriculum at the Telluride Patient Safety Student & Resident Summer Camps for the last four years, shared with more than 300 alumni: resident physicians and medical, nursing, pharmacy and law students. It was with great pleasure that I read Telluride alum Stephanie Wappel's response to Paul's NYTimes piece this weekend, one of the few selected from the many submitted (see additional comments, including those from MedStar's Human Factors Engineering Director, Dr. Terry Fairbanks, and myself, here):

I was fortunate to attend a conference on patient safety for which Mr. Levy was a faculty leader. I agree that we need to change the culture regarding the disclosure of medical errors. We cannot learn from what we do not know, and what we do not know can seriously harm our patients.

One strategy that has been implemented at my home institution is the celebration of “good catches.” Every Monday, all hospital employees receive an e-mail that features the “good catch” of the week, in which an error was detected and reported before it had the potential to cause harm. Any hospital employee can report these good catches.

They range from a nurse’s realizing she received the wrong dose of medication from the pharmacy to a medical student’s stopping her patient from getting a procedure that the physician thought he had canceled in the new electronic ordering system. Obviously, the institution is also working on discovering how that error occurred to prevent similar ones.

It is no easy task to change a culture, but this seems to be a good start.

STEPHANIE WAPPEL
Washington, Oct. 16, 2013
The writer is a resident physician at Georgetown University Hospital.

The Good Catch Program Stephanie mentions has been a concerted effort by our Patient Safety team at MedStar Health to share the learning opportunities that arise day to day in healthcare, to celebrate those who have the courage to report them, and to face these events head on rather than hiding from them.

If health care is to achieve the safety successes seen in other high-risk industries such as aviation, we must learn to balance safety and accountability. For caregivers who knowingly and recklessly violate safe practice, discipline is the right course and much needed. But most errors that lead to patient harm occur because of bad systems or processes, not bad people. Until we can be open and honest about our mistakes, learn from them and support our well-intentioned colleagues, we will continue to struggle.


Using Healthcare Human Factors to Keep Patients Safe

We can’t change the human condition, but we can change the conditions under which humans work.
James Reason

I had the chance to attend a mini-course on the Science of Safety at IHI’s 24th Annual Forum last week, led by Don Berwick and others. I have heard him give this talk before, but it is a good message…plus Don can speak on hand soap and totally engage his audience while making the talk educational. His focus was on how and what we can learn by applying human factors engineering principles to our healthcare work. Don referred to James Reason’s quote above, and focused his presentation on five human factors engineering lessons healthcare needs to adopt within our culture (an illustrative sketch of lesson 4 follows the list):

  1. Avoid reliance on memory
  2. Simplify
  3. Standardize
  4. Use constraints and forcing functions
  5. Use protocols and checklists
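To make lesson 4 a little more concrete, here is a minimal, hypothetical sketch of how a constraint and a forcing function might be expressed in software at a medication-ordering step. The drug name, dose limits, and function names are illustrative assumptions of mine, not drawn from any real clinical system or from Don's talk.

```python
# Hypothetical sketch of "constraints and forcing functions" at order entry.
# All drug names, limits, and field names below are illustrative only.

from dataclasses import dataclass
from typing import Optional

# Constraint: illustrative per-drug dose limits in mg/kg.
DOSE_LIMITS_MG_PER_KG = {
    "gentamicin": (1.0, 7.0),
}

@dataclass
class MedicationOrder:
    drug: str
    dose_mg: float
    patient_weight_kg: Optional[float] = None  # may be missing at order time

def validate_order(order: MedicationOrder) -> list:
    """Return a list of blocking problems; an empty list means the order may proceed."""
    problems = []

    # Forcing function: a weight-based dose cannot be submitted without a recorded weight.
    if order.patient_weight_kg is None:
        problems.append("Patient weight is required before a weight-based dose can be ordered.")
        return problems

    # Constraint: the system, not the clinician's memory, rejects out-of-range doses.
    limits = DOSE_LIMITS_MG_PER_KG.get(order.drug)
    if limits is not None:
        low, high = limits
        dose_per_kg = order.dose_mg / order.patient_weight_kg
        if not (low <= dose_per_kg <= high):
            problems.append(
                f"{order.drug}: {dose_per_kg:.1f} mg/kg is outside the allowed range "
                f"({low}-{high} mg/kg)."
            )
    return problems

if __name__ == "__main__":
    # The order is blocked until the missing weight is entered.
    order = MedicationOrder(drug="gentamicin", dose_mg=400)
    print(validate_order(order))
```

The point of the sketch is not the particular check but the design principle: the system carries the rule, so safe practice does not depend on any one person remembering it at the bedside.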

Human factors engineering expertise is being invited into the safety and quality conversation more and more today. Like the voices of patients and families, this set of eyes and knowledge has also been missing from the discussions that lead to meaningful change in our care systems. Large integrated health systems, like MedStar Health, are taking that next step and making major investments in human factors engineering in their quest to make care safer for their patients, whatever the cost. Terry Fairbanks and his team at the National Center for Human Factors Engineering in Healthcare represent this new model. The only large center of its kind in human factors engineering in the United States, Terry’s team is available to help redesign our systems (and those of others) in the best interest of patient safety, as well as to design safer and more efficient systems from the start.

What is human factors engineering? A simple explanation from Terry’s website describes it as:

…an interdisciplinary approach to evaluating and improving the safety, efficiency, and robustness of work systems, such as healthcare delivery. Human Factors scientists and engineers study the intersection of people, technology, policy, and work across multiple domains, using an interdisciplinary approach that draws from cognitive psychology, organizational psychology, human performance, industrial engineering, systems engineering, and economic theory.

As Don Berwick emphasized in his talk, a human factors approach puts science into the safety conversation, providing us new ways to look at old problems. Albert Einstein warned us that ‘we can’t solve problems by using the same kind of thinking we used when we created them,’ and Terry’s team introduces new and different ways of thinking about the problems in our healthcare systems that continue to put patients at risk.

On March 11-12, 2013, Terry’s team will be the official host of the Human Factors and Ergonomics Society (HFES) conference in Baltimore. This will not only help the HFES draw healthcare providers and administrators into its work, but also allow attendees to better understand how human factors applied to healthcare creates a safer environment for both patients and providers. It is a “must attend” conference for those looking to take quality and safety learning to the next level.