Humans may err, but computers are supposed to be accurate all the time. Except, of course, they're not. And as humans rely more and more on technology, researchers are finding, they must make allowances for the errors of both the hardware and the people operating it.
In recent studies conducted in flight simulators, pilots who relied solely on automated decision aids -- designed to reduce human error -- often found themselves the victims of unintended consequences that might have proved deadly in an actual flight.
According to University of Illinois at Chicago psychologist Linda Skitka, who has been studying the phenomenon with a teammate for five years, people working with computerized systems are prone to two kinds of errors.
First, when a computer tells them to do a task, many do it without double-checking the machine's accuracy, even though they have been told the system is not fail-safe. The researchers dubbed this an error of commission.
For example, the test pilots were told to run through a five-step checklist to determine whether an engine was on fire. One of the steps was a computerized warning signal. When they received the signal, the pilots all shut down the suspect engine -- without completing the other four steps.
It turned out that a completely different engine had been on fire. When asked about their decision, all the pilots said they had run through the entire checklist when in fact they had not.
"Most of these systems are being designed by engineers who think the way to get rid of human error is to engineer the human out of the equation," said Skitka. "To some extent, that's right. But to the extent that we still have human operators in the system, we need to take a look at the human-computer interaction and be more sensitive to the human side."
The second common mistake, which researchers classified as an error of omission, takes place when a computer fails to detect a mishap and human operators miss it too because they haven't run through a manual checklist.
It was an error of omission that preceded the 1983 downing of a Korean Air jet that had strayed into Soviet airspace, Skitka said. The pilot allegedly never double-checked the autopilot to make sure it was following the correct flight path. It wasn't, she said.
Indeed, in studying anonymous near-accident reports filed with the airlines by pilots, Skitka found that many mistakes involved pilots programming the flight computer to do specific tasks but not bothering to check that it was performing those tasks.
The studies were conducted at the NASA Ames Research Center in California and at the University of Illinois and have left Skitka suspicious of any task that involves highly technical systems that monitor events. That includes work in the nuclear energy and shipping industries and even hospital intensive care units, where monitors are relied on for life-and-death decisions, she said.
Better technical design and operator training are potential solutions, she said. Perhaps the biggest problem is that many of the tasks that need to be performed in automated situations are dull. Those tasks somehow need to be made more interesting so humans don't go on autopilot themselves, she said.
"I'm still a fan of automation but now we've introduced new possibilities for human error," said Skitka. "(Computers) are never going to be able to be programmed for every possible contingency. We have to make sure we keep that human factor in our equation."