“qualified” programmer; calls to replace “pseudo-artists [programmers] by engineers and to treat programming as a normal branch of engineering”; and rising costs and increased incidence of failure.64 In 1987, the editors of Computerworld complained that “the average software project is often one year behind plan and 100% over budget.”65 Two years later the House Committee on Science, Space, and Technology released a report highly critical of the “shoot-from-the-hip” culture of the software industry. Later that same year the Pentagon launched a broad campaign to “lick its software problems” that included funds for a Software Engineering Institute and the widespread adoption of the Ada programming language.66 The list of critical reports, denunciations of contemporary methodologies, and proposed silver bullet solutions continued to grow. And yet, in the words of one industry observer, by the mid-1980s “the software crisis has become less a turning point than a way of life.”67

In the late 1990s the Y2K crisis called new public attention to this long-standing debate; in many respects, however, it added little to an already-established discourse. It is a rare article on software engineering that does not make at least a token reference to the ongoing crisis. The legacy of the past continues to shape the possibilities of the future.
Computing as a Human Activity
It is tempting, from the vantage point of the early twenty-first century, to view the widespread adoption of the electronic computer as an uncomplicated and technologically determined process, driven by the growing informational demands of modern scientific, corporate, and governmental organizations along with the obvious superiority of the general-purpose, programmable digital computer as a tool for managing and manipulating information. Indeed, from a modern perspective, it is difficult to imagine a more obviously useful and desirable technology. The inherently protean nature of the electronic computer—its ability to be easily reconfigured, via software, to accomplish an almost infinite number of applications—combined with regular and impressive improvements in the underlying hardware makes the computerization of modern society seem, in retrospect, overdetermined, almost inevitable.

But like all great social and technological innovations, the computer revolution of the previous century didn't just happen. It had to be made to happen, and it had to be made to happen by individual people, not impersonal processes. One of the most significant and lasting insights of recent scholarship in the history of technology is that technological