obvious that there were time pressures involved. The PF knew that he had to
respond relatively quickly whilst still listening to the PNF's advice. Perhaps if the
PF had been more experienced, and the PNF had not felt the need to talk the PF
through the arrival route, neither of them would have been distracted from the task
at hand: the PF would have been more likely to level off at the appropriate altitude,
and the PNF more likely to detect any anomalies in the plane's status.
The air accident at Kegworth in the UK (Air Accidents Investigation Branch 1989), described in the Appendix, offers another example of a failure that was officially attributed to pilot (human) error. One of the problems was that when the crew shut down the good engine, this coincided with a reduction in vibration and a cessation of the smoke and fumes from the faulty engine. This led the crew to believe that they had taken the correct action, which can be at least partly attributed to their use of a flawed mental model (Besnard et al. 2004).
It is also important to ensure that appropriate account is taken of users' physiological limitations as well as their psychological ones. A simple example is the original design of packaging for medication tablets (and some other potentially hazardous household items, such as domestic bleach). It used to be quite easy for a young child to unscrew the cap from a medicine bottle and then eat the contents because they looked like sweets. The solution was the child-proof safety cap. Although children could not open these caps in tests, older people also found them difficult to open, particularly if they suffered from arthritis. In complex situations, such as flying an airplane (particularly a smaller one), the issues involved may be more subtle. Here it is important that the pilot is not asked to do things like move their head in one direction whilst the aircraft is moving in another, because this can lead to severe disorientation.
The design limitations of the system also need to be taken into account. There is often a general expectation that the human operator will compensate for any inadequacies in the system's design. Usually training is used to bridge the gap, but sometimes users are simply left to work things out for themselves.
The technique of Crew Resource Management (CRM), originally called Cockpit Resource Management, was developed (Wiener et al. 1993) to anticipate some of the potential problems that can arise from the interactions between people, technology, and context within aviation. CRM aims to minimize the potential for failures in interpersonal communication at crucial times during a flight. Such failures have led to real accidents, including:
• A plane crashing on take-off because the distracted crew failed to complete a safety checklist that would have revealed that the aircraft's flaps had not been extended.
• A plane crashing into a river after the co-pilot failed to convey to the Captain his concerns that the take-off thrust had not been set properly. The co-pilot felt that he could not tell his superior what to do.
• A plane crashing when it ran out of fuel due to a communications breakdown
between the Captain, the co-pilot, and air traffic control about the amount of fuel
onboard.