Wednesday, December 18, 2013

Patient Safety and the Man-Machine Interface

We know that humans make errors, and we also know that we can eliminate some of those errors through the appropriate use of technology. For example, computerized provider order entry (CPOE) systems have just about eliminated medication errors due to misinterpreted handwriting or the use of dangerous abbreviations.
However, machines don’t fix everything and computerization creates new opportunities for error at what is known as the man-machine interface.

An example of a catastrophe created at this interface was the crash of American Airlines Flight 965. The aircraft, a Boeing 757 on a scheduled flight from Miami International Airport to Cali, Colombia, crashed into a mountain in Buga, Colombia, on December 20, 1995, killing 151 passengers and 8 crew members.

Cali's approach uses several radio beacons to guide pilots around the mountains and canyons that surround the city. The airplane's flight management system already had these beacons programmed in, and could have told the pilots exactly where to turn, climb, and descend, all the way from Miami to the terminal in Cali. Essentially, once the pilots had programmed the computer, the plane could have taken off and landed itself successfully.

Cali's controllers asked the pilots if they wanted to fly a straight-in approach to runway 19 rather than coming around to runway 01. The pilots agreed, hoping to make up some time. The pilots then erroneously cleared the approach waypoints from their navigation computer. When the controller asked the pilots to check back in over Tuluá, north of Cali, it was no longer programmed into the computer, and so they had to pull out their maps to find it.

By the time they found Tuluá's coordinates, they had already passed over it. In response, they attempted to program the navigation computer for the next approach waypoint, Rozo. However, Rozo was identified as "R" on their charts, and Colombia had used that same identifier for the Romeo waypoint near Bogotá. The computer's list of stored waypoints did not include Rozo as "R," but only under its full name, "ROZO." Where a country allowed duplicate identifiers, the computer's list often placed the one near the largest city first.

By picking the first "R" on the list, the captain caused the autopilot to start flying a course toward Bogotá, and the airplane turned east in a wide semicircle. By the time the error was detected, the aircraft was in a valley running roughly north-south, parallel to the one it should have been in, and on a collision course with a 3,000-meter (9,800-foot) mountain. The pilots realized their error too late, and the plane crashed into the mountain. A system designed to make flight safer had been misused by the humans flying the plane, and 159 people died.

Recently, at GBMC, we had a near miss from a human error at the man-machine interface. An ED doctor trying to order a CT scan of the head and neck inadvertently clicked on the next name in the list and ordered the CT on the wrong patient. In the old system, the doctor would have taken an order sheet and stamped the patient's name on it, but in the computerized world, this new opportunity for error presented itself.

The wrong patient did not get the scan because our design for safety includes a check, and it worked that day. Jana Sanders, the CT technician on duty, reviewed the patient's record and questioned the orders because there was no mention of a fall or of head or neck pain. Jana called the physician to make sure he wanted these exams on the patient before doing the scans. At that point, the physician realized the error, thanked Jana for catching it, and put the order in for the correct patient.
 
It is said that in highly reliable systems, operators have a preoccupation with failure. Operators like Jana have a questioning attitude because they know that humans make errors, so they follow the design for safety and perform the check to make sure they have the right patient. They also realize that computerization prevents many types of errors but creates new ones at the man-machine interface. Our hats are off to Jana for a job well done!

What opportunities for error do you see in your work at the man-machine interface?

