“Select channel A.”
A practicing physician looks at the smart IV infusion pump placed in front of her, confused.
“Select channel A,” another healthcare practitioner repeats, as she stares at the instructions printed on the sheet she is holding in front of her.
The other participants at the table are also gathered around the pump with quizzical looks on their faces.
“Is there anything else written there?” someone asks.
“No. I’m sorry,” says the physician. “It just says: select channel A. …”
These participants were among nearly 50 healthcare professionals who came to Madison from around the world in July 2015 to participate in the weeklong Systems Engineering Initiative for Patient Safety (SEIPS) short course on human factors engineering and patient safety organized by Pascale Carayon, the Procter & Gamble Bascom Professor in Total Quality in the Department of Industrial & Systems Engineering, and her team from the Center for Quality and Productivity Improvement (CQPI).
SEIPS is a multidisciplinary initiative applying human factors and systems engineering approaches to promote patient safety. First funded in 2001 by the Agency for Healthcare Research and Quality as the Developmental Center for Evaluation and Research for Patient Safety (DCERPS), SEIPS research empirically examines systems design, job design, and technology implementations that affect safety-related patient, organizational, and staff outcomes.
Patient safety surfaced as a central theme after the Institute of Medicine detailed the incidence and impact of medical errors in a major 1999 report. A 2013 article by John James in the September issue of the Journal of Patient Safety estimated that preventable medical errors are the third-leading cause of death in the United States after heart disease and cancer, claiming as many as 400,000 lives each year. In light of this, Carayon’s work has become central to the discussion about how to design healthcare systems and processes to overcome this national and international problem.
Back in the test studio, the physician fumbles with the buttons, looking for the correct one. Noticing that the letter A has appeared on the digital screen, she presses it, but nothing happens.
“That’s right, it’s not a touchscreen!” she exclaims, and jabs excitedly at the key next to the A. Still nothing.
Frustrated, she places the pump on its side in search of the letter A.
“We can’t just flip it on the side—imagine if we were in the hospital right now,” comments the pharmacist, who serves as manager of quality and medication safety services for a large international healthcare organization.
Finally, she locates the correct button, “channel select,” on an attached device opposite the pump programming unit. In the time it took to locate the button, a real-life patient waiting for an IV infusion to begin may have lost his or her trust in the device—and in the clinician programming the pump.
The activity showed short-course participants how easy it is to make an error when using a medical device, even in a room full of experienced physicians, nurses, pharmacists, and other healthcare administrators and professionals. It also echoed a real case recently reported to the FDA, in which a patient admitted to the hospital for issues related to chronic liver problems died as a result of an error in programming the pump for a routine infusion.
Although just a simulation, the exercise clearly illustrated the reality that is today’s world of healthcare: a convoluted set of overlapping and confusing systems involving multiple people and forms of technology.
What appears to many of us as a simple task (such as programming an IV pump or entering a medication order in the electronic health record), performed by thousands of healthcare professionals every day, in the same way and on the same machine, is actually a confusing and complex process that carries far more risk than we would imagine.
“There are many different types of pumps—each one is different and many hospitals will have multiple types operating throughout the organization at the same time,” says Ann Hundt, CQPI’s associate director for education, who ran the simulation activity with the group.
To complicate things further, she explains, instructions are often difficult to understand, and the technology sometimes includes optional overrides that apply to most patients, but not all. These challenges are compounded by the fact that hospital staff have varying degrees of training and experience.
So what can we do? With an estimated 1,000 deaths a day in U.S. hospitals due to errors, change is imperative; it could save hundreds of thousands of lives each year and have significant positive financial consequences.
Carayon cautions that change comes slowly and in small pieces. For the smart pump, a good place to start would be reevaluating how the pump is introduced and fits in the clinical workflow. Additional changes can be introduced with more widespread dissemination and use of human factors and systems engineering expertise among health IT vendors and manufacturers and healthcare delivery organizations.
“People want to turn to technology for answers. They think that if they make a new pump it will be easier,” explains Carayon. “The result is a proliferation of different pump models, each with its own design strengths and weaknesses. Yet, well-engineered system design changes coupled with staff support and workflow integration could have a profound impact on the quality of the hospital, staff, and the patient’s well-being.”
That’s what one physician attendee is placing her bets on. She left the short course planning to share what she learned in the SEIPS course with her colleagues, and intends to speak with her peers about the processes used for technology design and implementation at her organization. “Reconvene us in 10 years and see how much things have changed,” she says.