The captain appears to have eliminated the vibration gauges from his mental model,
because he had found that they did not provide any useful information (they were
known to be unreliable). If the captain had looked closely at the EIS, however, he
might have observed information about the engines that would have changed how the
flight crew dealt with the engine problems.
Technology Issues
The EIS fitted to the B737-400 used digital rather than analogue displays.
A subsequent survey showed that nearly two-thirds of BMA pilots believed that the
new EIS was not effective in drawing their attention to rapid changes in the engine
parameters, and nearly three-quarters preferred the old EIS. The designers of the
EIS, and those responsible for training, could therefore be deemed to have
contributed to the accident. It also appears that the BMA pilots (at least) were not
involved in evaluating the new EIS before they had to use it in flight.
External Issues
When the aircraft was within sight of the airport, the #1 engine finally failed
completely. There was not enough time to restart the #2 engine, and the aircraft
ended up landing on the M1 (one of the UK's main motorways). The road was flanked
by noise abatement embankments (small hills), built to shelter the surrounding land
from motorway noise; these caused the plane to bounce and probably made the crash
worse.
Summary
The formal accident investigation attributed the cause of the accident to pilot error.
As you look through the description of what happened, and the list of contributory
events, however, you should start to appreciate that the accident was really the
result of a series of mistakes, errors, and bad luck involving a wide range of
people across the broader system.
During a normal flight there are several things happening at the same time at
different levels within the air transport system, and the flight crew has to deal with
many of them. In the vast majority of cases, all the tasks are performed
successfully, and the flight arrives safely at its destination and in a timely manner.
It is often only when things go wrong, however, that you really begin to understand
just how complicated getting a plane full of passengers from its original airport to
its destination can be.