From The Ergoweb® Learning Center

Medical Errors: Human or Design?

Earlier this fall, Ergoweb reported on guidelines released to increase the safety and usability of medical devices (New Standard for Ergonomics in Medical Device Design, Nov. 28, 2001). In addition to examining how device design may contribute to medical errors, some researchers study the human factors of the health care system and the people who work in it. These researchers believe that by first understanding and predicting human performance in the health care system, a more ‘ergonomic’ system can be designed, one that reduces the number of medical errors, improves patient care, and saves lives.

Behind some of this research, and in part responsible for making sure it is shared throughout the health care field, is the Quality Interagency Coordination (QuIC) Task Force. Established by former President Clinton, the QuIC aims to ensure that all Federal agencies that purchase, provide, study, or regulate health care services work in a coordinated way toward the common goal of improving the quality of care. The QuIC seeks to provide information to help people make choices, to improve the care purchased and delivered by the Government, and to develop the infrastructure needed to improve the health care system.

In September of this year, the QuIC sponsored the National Summit on Medical Errors and Patient Safety Research. Among those invited to give testimony were human factors/ergonomics specialists, including David Woods, Past President of the Human Factors and Ergonomics Society (HFES).

Note: The following is excerpted from testimony written by David Woods for the National Summit on Medical Errors and Patient Safety Research, September 11, 2001. For the full testimony, see http://www.quic.gov.

Throughout the patient safety movement, health care leaders have consistently referred to the potential value of Human Factors research on human performance and system failure. The patient safety movement has been based on three ideas derived from the results of research on human expertise, collaborative work, and high reliability organizations, a research base built up through investments by other industries:

  • Adopt a systems approach to understand how breakdowns can occur and how to support decisions in the increasingly complex worlds of health care.
  • Move beyond a culture of blame to create an open flow of information and learning about vulnerabilities to failure.
  • Build partnerships across all stakeholders in health care to set aside differences and to make progress on a common overarching goal.

Creating a durable, informative, and useful partnership between health care and those disciplines with core expertise in areas of human performance is critical to advancing patient safety.

The research base on human performance has been built up from studies of how practitioners interact to handle situations in many different contexts, such as aviation, industrial process control, and space operations. The results capture empirical regularities and provide explanatory concepts and models. They allow us to go behind the unique features and surface variability of many different settings to see common underlying patterns.

Research has found that doing things safely, in the course of meeting other goals, is always part of operational practice. Because people in their different roles are aware of potential paths to failure, they develop failure-sensitive strategies to forestall these possibilities. When failures occurred against this background of usual success, researchers found multiple contributors, each necessary but only jointly sufficient, and a process of drift toward failure as planned defenses eroded in the face of production pressures and change. The research revealed systematic, predictable organizational factors at work, not simply erratic individuals. It also showed that to understand episodes of failure, one must first understand usual success: how people in their various roles learn and adapt to create safety in a world fraught with hazards, tradeoffs, and multiple goals.

Virtually every Human Factors practitioner and researcher, on examining the typical human interface of computer information systems and computerized devices in health care, is appalled. What we take for granted as the least common denominator in user-centered design and testing of computer systems in other high-risk industries (and even in commercial software houses that produce desktop educational and games software) seems to be far too rare in medical devices and computer systems. The devices are too complex and require too much training to use given typical workload pressures.

We have a window of opportunity for improving safety for patients, but there are many false trails that will consume the energy and resources available. To take advantage of this window, we must be prepared to question conventional wisdom and assumptions by building a partnership between the different health care specialties and the different human performance specialties that intersect at the label ‘human error.’ These human performance specialties are substantive, deep, and unfamiliar to health care. I have not been able to cover all of them in this piece. But they are the wellspring for the techniques, concepts, and systems that will improve human performance in health care, as has been the case in other high-risk domains.

From past work in Human Factors, a simple standard emerges for judging success in research on error and safety. Research is successful to the degree that it helps us recognize, anticipate, and defend against paths to failure that arise as organizations and technology change, before any patient is injured.