From The Ergoweb® Learning Center

New Book Probes the Machine

Cognitive scientist Don Norman, Ph.D., explores the frontiers of technology in his forthcoming book, “The Design of Future Things.” The conquest of this territory depends in part on mastering some of our present issues with automation. He suggests it might help if designers pretended they were computers trying to interact with irascible, illogical people. But don’t think he is yet another technologist trying to turn people into machines. On the contrary, he thinks it’s time designers of machines worked harder to understand and empathize with everyday people: thinking like a machine is his clever strategy for getting there.

Dr. Norman is a co-director of a combined business administration and engineering degree program at Northwestern University, which awards students both an MBA and a master’s degree in Engineering Management. In this program he teaches managers and engineers about design and operations, the invisible part of a product or service that actually yields the desired flawless operation. He holds the Breed Chair of Design in the Computer Science department and is also a professor of Psychology and Cognitive Science. Dr. Norman is also co-founder and principal of the Nielsen Norman Group, a California consultancy that specializes in human-centered products and services. His doctorate from the University of Pennsylvania is in Psychology; it caps a master’s degree from the same school and a bachelor’s degree from MIT, both in electrical engineering. He has been honored in these and several other fields, an indication of the range of expertise useful to a recognized visionary.

An Uneasy Relationship

In his forthcoming book, and the 14 others he has authored or co-authored, the professor probes man’s often uneasy relationship with technology. The significant number of foreign translations of his top books suggests the world is eager for ideas on bringing technology under control.


A recent book, “Emotional Design: Why We Love (or Hate) Everyday Things,” gives aesthetics and pride of ownership more important roles in human-centered design. “The book pops with fresh paradigms, applying scientific rigor to our romance with the inanimate,” said Wired magazine in January 2004. “You’ll never see housewares the same way again.”

When situations exceed the capabilities of automatic equipment, Dr. Norman explained in a 1990 paper, inadequate feedback leads to difficulties for the human controllers. “Problem of Automation: Inappropriate feedback and interaction, not over-automation” appeared initially in the volume “Human Factors in Hazardous Situations,” edited by D. E. Broadbent, A. Baddeley & J. T. Reason and published by Oxford University Press. The professor regards the 1990 observation as just as relevant to human-centered design today as it was when it was written.

A Map to the Future

It’s a theme that appears again in the forthcoming book, to be published by Basic Books, probably in November 2007. The chapters map his path through the issues: Cautious Cars and Cantankerous Kitchens; How Machines Take Control; Servants of our Machines; The Psychology of People & Machines; The Role of Automation; Natural Interaction; Six Rules for the Design of Smart Things; The Future of Everyday Things; and The Machine’s Point of View.

Discussing the book in an interview with The Ergonomics Report™, Professor Norman began by pointing out that well-known, present-day problems of automation in the workplace, such as warning signals that disrupt the ability to deal with difficulties as they occur, have now moved into everyday life. They are occurring in our homes and automobiles, he said, where they represent a different issue. “We’re talking about the average citizen. They aren’t trained, they won’t be able to cope in the same way that professionals do.” He added that even professionals accustomed to dealing at work with automation problems can find them more difficult in a different setting, like the home. “In some sense, [that’s] the focus of the book.”

“Mainly everyday citizens,” he responded when asked about the intended audience, but he added that he also wants it read by decision makers. He described them as business people who have authority but no technical background in ergonomics. “We have just instituted a program at Northwestern University to train business people,” he said. People in the MBA program “undergo training in design and human factors issues so that as they become managers they will become much more appreciative of these issues.”

He pointed out that there is a significant difference between coping in a professional setting and in the home. “In the emergency room of a hospital, there are many people highly trained, and the nurses’ and anesthesiologist’s job is to monitor the patient. So even though the emergency signals and alarms are incredibly badly done, we have very competent professionals whose job is to monitor, to take action and to inform the rest of the people what is happening. In aviation and in commercial plants, we have the same situation – professionals who are well-trained, who understand the signals.”

And time is on their side, he added. In aviation, if everything goes wrong in the plane there can still be one to two minutes to respond and avert a crash. “What is new about the automobile is that we have the same types of warning signals and buzzes and flashes and alerts, but untrained drivers, who are distracted, [and might] … have to respond within one second.” This is a very different situation than has been studied in the past, he noted, and it’s “one that frightens me.”

The professor observed that the designers of the automation in automobiles have not learned the proper lessons of industry: “[They] don’t seem fully aware of the important distinction: one, untrained drivers; two, the lack of time.”

He regards partial automation as dangerous. “We have partial automation such as adaptive cruise control, which does some braking and maintains some degree of safety, but is not intended to be an emergency responding system. I find that this distinction is lost on the drivers, who will come to rely upon it. But we all know that automation fails when the situation becomes too complex. In other words, automation fails where it is most needed. Now this is understood by the professionals, but will be a disaster with the everyday person.”

The answer is full automation, or systems that are not automated but are informative, according to the professor. In automobiles, he said, this would mean “a way of enhancing the situation awareness so the person is continually aware of the environment through natural means.” Vibration or natural sounds qualify as enhancements. Auditory announcements, language and vision don’t qualify “because vision is already taken up with the driving task and language announcements are simply viewed as a nuisance.”

Given humans’ taste for the experience of driving – the moment-to-moment control of actions like cornering and accelerating, for instance – it seems improbable that they would hand control to a vehicle. Asked to comment on this observation, the professor noted that full automation is only a stage in an unnoticed process that has been under way for decades. “We already have full automation in many components of the car – anti-skid brakes, stability control and also anticipatory braking. If you hit the brake pedal fast but do not depress it all the way, many automobiles will provide full braking power. They believe that if you hit it fast you must have intended full brakes, even if you didn’t do it.”
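The brake-assist behavior he describes amounts to a simple decision rule: infer the driver’s intent from how quickly the pedal moves rather than how far it travels. The Python sketch below is purely illustrative; the function name, the PedalSample structure and the threshold value are hypothetical and are not drawn from any real vehicle system.

```python
# Illustrative sketch of the brake-assist heuristic described above:
# if the pedal is pressed quickly, treat it as an emergency stop and
# command full braking even though the pedal is not fully depressed.
# All names and threshold values here are hypothetical, chosen only
# to make the decision rule concrete.

from dataclasses import dataclass


@dataclass
class PedalSample:
    position: float   # 0.0 = released, 1.0 = fully depressed
    timestamp: float  # seconds


def requested_braking(prev: PedalSample, curr: PedalSample,
                      speed_threshold: float = 2.0) -> float:
    """Return the braking force to command, in the range 0.0 to 1.0.

    If the pedal moved faster than `speed_threshold` (fraction of full
    travel per second), assume the driver intends an emergency stop and
    command full braking; otherwise braking simply tracks pedal position.
    """
    dt = curr.timestamp - prev.timestamp
    if dt <= 0:
        return curr.position
    pedal_speed = (curr.position - prev.position) / dt
    if pedal_speed > speed_threshold:
        return 1.0          # fast press: full automation of this component
    return curr.position    # slow press: braking follows the pedal


# Example: the pedal jumps from 10% to 60% of its travel in a tenth of a
# second, so the heuristic commands full braking despite the partial press.
if __name__ == "__main__":
    before = PedalSample(position=0.1, timestamp=0.0)
    after = PedalSample(position=0.6, timestamp=0.1)
    print(requested_braking(before, after))  # -> 1.0
```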

This is partial automation of the automobile but full automation of the component, he added, which is acceptable. “We have full automation of many of the internal things that used to be manual that people have forgotten about, like spark advance and choke. That has been automated for so long that people aren’t even aware. Even shifting is mostly automated – automatic transmission – so those things work well.” His concerns center on features like lane-keeping controls and adaptive cruise control.

Dr. Norman is confident that in the future, “maybe in one hundred years,” we will have fully automated vehicles. It is the only way to lower the death and injury rate from automobiles, which is uniformly high across the world, he said. “People will complain, ‘I love driving. How can you take driving away from me?’ Well, my response is that there are people who love horseback riding. The answer to that is that there will be special places you can go where you can drive, just as there are special places you can go to ride your horse. But for the everyday life, it will be automatic.”

The interview turned to specific themes in the book, and the professor was asked to explain what he means by “Servants of our machines.” Everything we own requires maintenance and care in this age of proliferating technology, he responded. “Every device we own has a microprocessor in it, whether it’s your toaster, your scale, even your bathroom mirror. Each one of these is going to require some kind of maintenance, some kind of care.”

The maintenance extends to rebooting household items, including bathroom scales and even toilets, he explained, adding that we already have to reboot microwave ovens and refrigerators. “So that even if something requires care only once a year, when we have fifty or even one hundred such items, it means we are continually taking care of them.” He needs to maintain the things in his household every single day, maybe every single hour, he said, citing his computer as a particular burden. A recent problem with his television took two days to fix. “Thoreau many years ago complained that the farmer was driven by the farm implements he had, and for that matter, the demands of the animals, but our lives are being more and more [burdened] by everything we own. The automobile needs servicing. I even speculate that your car will say some day, ‘I’m not going to drive you any more [because] you haven’t had me serviced.’”

“I don’t have an answer,” he replied when asked how we can reduce the burden of our machines. “One simple answer is to make things that don’t require attention, and I think everybody would agree, that is not so easy to do.” He sees another answer in engaging a maintenance firm in the way we now hire a cleaning service. “A squad of people will come into my home once a month and go around every piece of equipment and update it and change the virus protection, and oil it and do whatever is required.”

The professor observed that we have always had to spend a lot of time on our own upkeep, and said he is amazed at how much time he has to spend maintaining himself. The ongoing demands include bathing, brushing and flossing the teeth and having the hair cut. “It’s no surprise that as we add more equipment, especially intelligent, complex equipment into our home that [it requires] even more attention than we do.”

Asked to talk about “Six Rules for the Design of Smart Things,” the professor described his approach to the chapter. “On the one hand, I wanted to have a summary of the important findings from my book, which would be more important to the designers, [and] not so important to the everyday reader. … I thought I would try to summarize the lessons of good design of this new automatic equipment.”

The Machine’s Point of View

As he started thinking about this, he said, more and more he realized there was a different way of viewing it. “You can view it from the point of view, well, I’m designing this automatic equipment and what do you have to know, and I’m trying to tell the human designer. Then I realized you could reverse it. Suppose we had really intelligent machines, and these intelligent machines were frustrated because they had to deal with people who didn’t think and work the way that they did,” and with whom communication was difficult.

He started to phrase the rules in the language of the machines and discovered he could actually make them much more understandable. “So, if you take a human designer who is trying to build a machine, the human designer knows the machine so well [he or she] never thinks of what it is you need to do. … Whereas if you tell the human designer, ‘hey, take it from the machine’s perspective, [which] has to deal with these poor unfortunate people, what would you tell the machine?’ So that’s what this last section is about – the machine’s point of view.”

As he tried to put the rules into a language the machines would understand, he said, an amusing thing happened: “A friend of mine has a child who is somewhat autistic, and [who] has real trouble trying to interact with other people. And the child was always searching the Internet for things that might be helpful to him to understand and work with the people.” He came across my item, the professor added, and realized it answered problems he had been facing. “He found it very valuable. I found it very interesting that I must have hit it just right because it turns out that it’s of value to autistic people.”

Though the machine-language approach appears to be a fruitful direction for helping autistic people communicate, it proved to be a sensitive issue when the professor considered expanding on the idea. “In the draft of my book, I explained that machines are autistic, that machines do not understand people and therefore the communication between machines and people is strained because we believe the machines are more understanding than they are.” Several friends and the editor asked him to drop the autism angle, he said, and he did. The comparison isn’t brand new; he and friends in human factors have talked at times about autistic machines.

A major international award – possibly even a Nobel Prize – awaits researchers who succeed in helping people afflicted with autism to break out of their communications prison. Should Dr. Norman take up the issue again, he might need only to rephrase his findings so the language doesn’t upset people with the handicap.

In the meantime, there is the forthcoming book. A tool for mastering automation is likely to find a large audience.

Source: Dr. Don Norman

 

This article originally appeared in The Ergonomics Report™ on 2007-08-10.