will be eradicated. But as Leveson and Turner write, “There is always another software
bug” [48].
The real problem was that the system was not designed to be fail-safe. Good engineering practice dictates that a system should be designed so that no single point of failure leads to a catastrophe. By relying completely upon software for protection against overdoses, the Therac-25 designers ignored this fundamental engineering principle.
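To make the principle concrete, here is a minimal sketch in C. Everything in it is hypothetical (the actual Therac-25 software was written in PDP-11 assembly language, and these function names are invented for illustration): the beam is enabled only when the software's own check and an independent hardware interlock both agree, so a fault in either one alone cannot cause an overdose.

#include <stdbool.h>
#include <stdio.h>

/* Hypothetical stand-ins, invented for this sketch; they are not part
   of any real Therac-25 interface. */
static bool software_state_consistent(void) { return true;  } /* software's own check */
static bool hardware_interlock_closed(void) { return false; } /* independent circuit  */

/* Fail-safe rule: the beam may fire only if BOTH independent checks
   pass. A bug in the software check cannot defeat the hardware
   interlock, so no single failure leads to an overdose. */
static bool ok_to_fire_beam(void)
{
    return software_state_consistent() && hardware_interlock_closed();
}

int main(void)
{
    printf(ok_to_fire_beam() ? "beam enabled\n" : "beam blocked\n");
    return 0;
}

The Therac-25's predecessors embodied this rule in electromechanical interlocks; the Therac-25 removed the second check, leaving the software as a single point of failure.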
Another flaw in the design of the Therac-25 was its lack of software or hardware devices to detect and report overdoses and shut down the accelerator immediately. Instead, the Therac-25 designers left it up to the patients to report when they had received overdoses.
There are also particular software lessons we can learn from the case of the Therac-25. First, it is very difficult to find software errors in programs where multiple tasks execute at the same time and interact through shared variables; the sketch following this paragraph illustrates why such race conditions are so elusive. Second, the software design needs to be as simple as possible, and design decisions must be documented to aid in the maintenance of the system. Third, the code must be reasonably documented at the time it is written. Fourth, reusing code does not always increase the quality of the final product. AECL assumed that by reusing code from the Therac-6 and Therac-20, the software would be more reliable. After all, the code had been part of systems used by customers for years with no problems. This assumption turned out to be wrong. The earlier code did contain errors, but these errors remained undetected because the earlier machines had hardware interlocks that prevented the computer's erroneous commands from harming patients.
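To see why the first lesson is so hard to learn from testing alone, consider this minimal sketch of a race condition (again hypothetical C, not Therac-25 code): two tasks increment a shared variable without any synchronization, so a read-modify-write in one task can be interleaved with the other's and an update is silently lost.

#include <pthread.h>
#include <stdio.h>

static long shared_counter = 0; /* shared by both tasks, unprotected */

static void *task(void *arg)
{
    (void)arg;
    for (int i = 0; i < 1000000; i++) {
        /* Not atomic: if the other task runs between this read and
           the write back, one increment is lost. */
        shared_counter = shared_counter + 1;
    }
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, task, NULL);
    pthread_create(&t2, NULL, task, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);

    /* Expected 2000000; the actual result is usually smaller and
       differs from run to run. Compile with: cc -pthread race.c */
    printf("counter = %ld (expected 2000000)\n", shared_counter);
    return 0;
}

Because the failure depends on the exact timing of the two tasks, the program can pass thousands of tests and still fail in the field. The Therac-25 overdoses surfaced the same way: the fatal race occurred only when an operator entered and edited treatment data at just the right speed.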
The tragedy was compounded because AECL did not communicate fully with its
customers. For example, AECL told the physicists in Washington and Texas that an
overdose was impossible, even though AECL had already been sued by the patient in
Georgia.
8.5.5 Moral Responsibility of the Therac-25 Team
Should the developers and managers at AECL be held morally responsible for the deaths
resulting from the use of the Therac-25 they produced?
In order for a moral agent to be responsible for a harmful event, two conditions
must hold:
• Causal condition: the actions (or inactions) of the agent must have caused the harm.
• Mental condition: the actions (or inactions) must have been intended or willed by the agent.
In this case, the causal condition is easy to establish. The deaths resulted both from
the action of AECL employees (creating the therapy machine that administered the
overdose) and the inaction of AECL employees (failing to withdraw the machine from
service or even inform other users of the machine that there had been overdoses).
What about the second condition? Surely the engineers at AECL did not intend or try to create a machine that would administer lethal overdoses of radiation. However, philosophers also extend the mental condition to include unintended harm if the