1978: Hartford Coliseum Collapse
The Hartford Coliseum was designed using a computer-aided design (CAD) software
package. The designer assumed only vertical stress on the support columns.
When one column collapsed from the weight of snow, lateral forces were applied
to surrounding columns that had not been designed to take lateral stress.
Lessons learned: The lesson from this failure is more about the human mind than
about software per se. The assumption of pure vertical compression was faulty,
and that was a human error.
Problem avoidance: This problem could have been found via inspections and
probably by requirements modeling. The problem is unlikely to have been found
via static analysis since it was a problem of logic and design and not of syntax.
Because the problem was one of design, pair programming might not have worked.
Ideally, having someone on the inspection or modeling team with experience in
structures designed for heavy snow might have broadened the assumptions.
Testing should have found the problem, but there is a caveat. If the same designer
who held the faulty assumption wrote the test cases, he or she would not have
included tests for lateral stress. A certified professional tester might have found
the defect, but perhaps not. Risk-based testing as practiced today might well have
found the problem.
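To make the caveat concrete, here is a minimal sketch in Python, with invented
function names and load values (the actual Hartford design software is not
available): tests written from the designer's own vertical-only assumption pass,
while a risk-based test that adds a lateral-load scenario fails and exposes the
defect.

    # Hypothetical capacity check that embodies the vertical-only assumption.
    VERTICAL_CAPACITY_KN = 500.0   # assumed axial capacity of one column
    LATERAL_CAPACITY_KN = 50.0     # assumed lateral capacity, ignored by the check

    def column_is_safe(vertical_load_kn, lateral_load_kn=0.0):
        """Faulty check: it considers only vertical compression."""
        return vertical_load_kn <= VERTICAL_CAPACITY_KN

    def test_written_by_the_designer():
        # Tests drawn from the same assumption exercise only vertical loads,
        # so the missing lateral check is never detected.
        assert column_is_safe(300.0)
        assert column_is_safe(500.0)

    def test_written_from_a_risk_analysis():
        # A risk-based test adds the scenario the design ignored: a neighboring
        # column fails and sheds lateral load onto this one. The faulty check
        # still reports "safe", so this assertion fails and exposes the defect.
        assert not column_is_safe(300.0, lateral_load_kn=80.0)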
1983: Soviet Early Warning System
In 1983, the Soviet early warning system falsely identified five incoming missiles
that were assumed to have been launched by the United States. Rules of engagement
called for a reprisal launch of missiles against the United States, which could
have led to World War III.
Fortunately, the Soviet duty officer was an intelligent person, and he reasoned
that the United States would never attack with only five missiles, so he concluded
it was a false alarm. Apparently, the early warning system was confused by sunlight
reflected from clouds.
Lessons learned: The lesson learned from this problem is that complex problems
with many facets are hard to embody in software without leaving something out.
A second lesson is that bugs in major military applications can have vast unintended
consequences that could cause the deaths of millions.
Problem avoidance: This problem might have been found by inspections with experienced
military personnel as part of the inspection team. The problem might
also have been found by requirements modeling.
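As a purely illustrative sketch in Python (the names, thresholds, and corroboration
rule are all invented and do not describe the actual Soviet system), the plausibility
check the duty officer applied in his head could have been captured as an explicit
requirement and cross-check in the software:

    MIN_CREDIBLE_SALVO = 20   # assumption: a real first strike would involve many missiles

    def alert_level(satellite_detections, ground_radar_confirms):
        """Classify a detection instead of treating every satellite report as an attack."""
        if satellite_detections == 0:
            return "no alert"
        # A handful of detections with no independent confirmation is more likely
        # to be a sensor artifact (such as sunlight reflected from clouds) than a
        # deliberate attack.
        if satellite_detections < MIN_CREDIBLE_SALVO and not ground_radar_confirms:
            return "probable false alarm - request corroboration"
        return "attack warning"

    # The 1983 scenario: five apparent launches and no radar confirmation.
    print(alert_level(5, ground_radar_confirms=False))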