From The Ergoweb® Learning Center

What is Cognitive Ergonomics?

Peter Budnick and Rachel Michael, June 11, 2001

Editor’s Note: This is a revised version of Cognitive Ergonomics and Engineering Psychology, which appeared in Ergonomics Today™ on June 11, 2001.

Ergonomics is sometimes described as “fitting the system to the human,” meaning that through informed decisions, equipment, tools, environments, and tasks can be selected and designed to fit unique human abilities and limitations. Typical examples in the “physical ergonomics” arena include designing a lifting job to occur at or near waist height, selecting a tool shape that reduces awkward postures, and reducing unnecessary tasks and movements to increase production or reduce errors and waste. “Cognitive ergonomics,” on the other hand, focuses on the fit between human cognitive abilities and limitations and the machine, task, environment, and so on. Example cognitive ergonomics applications include designing a software interface to be “easy to use,” designing a sign so that the majority of people will understand it and act in the intended manner, and designing an airplane cockpit or nuclear power plant control system so that operators will not make catastrophic errors.

Cognitive ergonomics is especially important in the design of complex, high-tech, or automated systems. A poorly designed cellular phone user interface may not cause an accident, but it may well cause great frustration for the consumer and result in a marketplace-driven business failure. A poor interface design on industrial automated equipment, though, may result in decreased production and quality, or even a life-threatening accident.

Complex automated systems create interesting design challenges, and research and post-accident analysis indicate that the human role in automated systems must be closely considered. Automation can increase operator monitoring and vigilance demands, impose complex decision-making requirements, and introduce other issues that raise the likelihood of errors and accidents.

Another interesting effect of automation is that humans will sometimes over-trust or mistrust an automated system.

The Three Mile Island nuclear power plant accident is, in part, an example of people over-trusting a system. During that event, the control panel indicated that an important valve had operated as instructed, and the control room operators trusted that the system was reporting accurately. In fact, the valve had not operated as instructed, and it became a key point in the failure that resulted in a serious mishap. (Interestingly, some blame the operators when, under the mental load created by the evolving accident, they performed exactly as an ergonomist would expect. The actual cause of the accident was a control system design error that provided incorrect information to the operators.)
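The design error at the heart of this account is easy to express in code: an indicator that reflects the command sent to a valve, rather than the valve’s measured position, will report success even when the mechanism fails. The sketch below is purely illustrative; the class and function names are hypothetical and are not drawn from any real control system.

    # Hypothetical illustration of the design flaw described above: the
    # status display trusts the command signal instead of a position sensor.

    class Valve:
        def __init__(self):
            self.commanded_closed = False  # what the operator asked for
            self.actually_closed = False   # what the mechanism really did

        def command_close(self, mechanism_ok=True):
            self.commanded_closed = True
            if mechanism_ok:               # a stuck valve ignores the command
                self.actually_closed = True

    def flawed_indicator(valve):
        # Design error: reports the commanded state, not the sensed state.
        return "CLOSED" if valve.commanded_closed else "OPEN"

    def sensed_indicator(valve):
        # Corrected design: reports the measured valve position.
        return "CLOSED" if valve.actually_closed else "OPEN"

    valve = Valve()
    valve.command_close(mechanism_ok=False)  # the valve sticks open
    print(flawed_indicator(valve))  # "CLOSED" -- the reading operators trusted
    print(sensed_indicator(valve))  # "OPEN"   -- what a position sensor shows

The fix reflects a basic cognitive ergonomics principle: display what the system actually senses, not what it was told to do.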

An example of mistrusting a system occurred at a medium-security women’s prison in Oregon, USA, when a new surveillance system was installed. The alarm was triggered whenever the system sensed motion in particular areas of the facility. During the first few weeks, the alarm was repeatedly triggered by everything from birds to leaves blowing in the wind. Because it so often triggered in error, the guards became conditioned to ignore it. Using this to her advantage, a prisoner climbed over the fences knowing that the alarm would go off, but that the guards would most likely ignore it long enough for her to escape. It worked. When this same mistrust effect occurs with something as important as a fire alarm, the results can be deadly.
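The guards’ behavior is also what simple base-rate arithmetic predicts: when false alarms vastly outnumber real events, the probability that any single alarm is genuine becomes tiny, and ignoring alarms starts to look rational. The following sketch applies Bayes’ rule with assumed, illustrative numbers; none of these figures come from the actual incident.

    # Assumed, illustrative figures only -- not data from the Oregon incident.
    p_real = 1e-4               # prior: fraction of motion events that are intrusions
    p_alarm_if_real = 0.99      # system detects 99% of real intrusions
    p_alarm_if_harmless = 0.05  # false-trigger rate on birds, leaves, etc.

    # Bayes' rule: P(real | alarm) = P(alarm | real) * P(real) / P(alarm)
    p_alarm = p_alarm_if_real * p_real + p_alarm_if_harmless * (1 - p_real)
    p_real_given_alarm = p_alarm_if_real * p_real / p_alarm
    print(f"P(real intrusion | alarm) = {p_real_given_alarm:.4f}")  # about 0.002

Under these assumptions, only about one alarm in five hundred signals a real intrusion, which is why frequent false alarms so reliably erode trust in any warning system, fire alarms included.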

Physical ergonomics issues, primarily in the workplace, dominate the public view and understanding of ergonomics. Fortunately, ergonomists are busy behind the scenes working to improve all human-machine interfaces, including the cognitive aspects. Unfortunately, many companies, engineers, regulators, and other decision makers fail to recognize the human factor in design, and many unnecessary errors, accidents, product failures, and other business costs are the predictable result.