Tuesday, January 21, 2014

I Like Swiss Cheese…But Not in My Work

I remember being a 22-year-old in Italy and not yet worldly-wise. I was trying to buy “Swiss cheese” in a supermarket and asked a friend to help me find it, translating the name literally. My friend pointed out that Switzerland was known for its cheeses and that there were many kinds of “Swiss cheese.” I explained that I wanted the one “with the holes in it,” and he then told me that the cheese was called Emmental.

Many years later, when I was learning about safety, human error, and complex systems, I was introduced to the “Swiss Cheese Model of Error,” the work of Professor James Reason of the University of Manchester in England. Professor Reason created the model to help his students understand how the thoughts and actions of people in a complex system can either contribute to bad outcomes, and in some cases catastrophes, or help prevent them.

Reason knew that humans understand, at some level, that they are fallible, so they build protective walls to “block” errors from resulting in bad outcomes. An example of this is the system for protecting patients from medication ordering errors. Physicians can and do occasionally make errors, ordering the wrong drug, the wrong dose, or the wrong route. The first wall of defense to block one of these errors is called pharmacist verification: every time a doctor orders a medication for a patient in the hospital, a pharmacist checks it to make sure that it is safe for the patient. But pharmacists also occasionally make mistakes, so the pharmacist verification wall has holes in it, too. The next wall of defense is called the nurse check. Nurses also make mistakes, so this wall is also imperfect. If you line up all of the walls, what does it look like? You guessed it…Swiss cheese.


Reason calls the holes in the cheese latent failures. Latent means hidden or present in an unexpressed form. These latent failures can be grouped into a number of categories. One category is failure at the managerial level. This occurs when a manager knows that a design meant to prevent a bad outcome is not being followed, but allows the deviation to continue. An example would be a manager who knew that the doctors and nurses were not doing the timeout correctly before a procedure but let the procedure go on anyway.

A second type of latent failure is called a psychological precursor. Psychological precursors can be thought of as beliefs held by those involved in a complex system that lead people not to follow the design. An example would be doctors and nurses who did not do the timeout properly because they believed the timeout was “stupid” and “not necessary and a waste of time.” A psychological precursor that is rampant in healthcare is: “I do it whatever way I have to, to get whatever my patient needs.” This precursor is the end result of clinicians working in broken systems: they come to believe that they don’t have to do it the way it is designed because the design never works anyway. It is the opposite of what people in other high-risk industries, like commercial aviation or nuclear power, believe: “I follow the design because it is not safe to do otherwise.”

Other holes in the walls may be due to less pervasive problems or one-time events. These latent errors are called local triggers, intrinsic defects, or atypical conditions. An example of an atypical condition is a clinician caring for a patient who does not speak his or her language. We know that the opportunity for error increases when our patients can’t participate fully in their care or in protecting themselves from harm.

Bad outcomes may start with an unsafe act, in which an individual does not follow a procedure as designed or does not follow the standard work.

Let me use the Swiss cheese model to tell a story about an employee injury at GBMC.

Our colleague falls and injures her arm.
An employee took an office-grade trash bag containing a lot of liquid out of a break room garbage can. The employee placed it on the hallway floor, where it leaked. (The designed procedure is to place the bag immediately in a cart.) The employee then realized that the bag had leaked, retrieved a mop, and cleaned up the spill, but did not put down a wet floor sign. A nurse turned the corner, slipped, and fell to the floor, injuring her wrist.

Applying the model, we see that one latent error in our system is that we use trash bags that are not designed to hold liquid in trash cans that may receive liquid. Although it is not absolutely clear, it appears that at least one person may have had the psychological precursor “any trash bag will do” in using a bag not designed for liquid. A second psychological precursor may have been, “I don’t need to follow the rule; it won’t leak; I’ll just put it on the floor.” The actual unsafe act was putting the bag on the floor. A final unsafe act was not putting up the wet floor sign.



Notice that if any one of these latent errors had been “fixed,” the hole would have been blocked and our colleague would not have been injured. If the correct bag had been used, if the person emptying the trash had put it directly in a cart, or if he or she had put up the wet floor sign, our colleague would not have been injured. But if only one had been fixed, the other latent errors would still have been present, waiting to align for the next bad outcome or catastrophe.

We cannot wait until the holes in the Swiss cheese align to create a catastrophe; we must fix the holes when we find them. This is what our near-miss (Quantros) reporting system helps us do.

What latent errors have you found and are working to fix in our healthcare system?
