and feasibility. It can be argued that GC is 'new' because until recently computers were neither fast enough nor possessed of sufficient memory to make a GC paradigm into a practical proposition capable of general application. This is no longer the case, as the 1990s have witnessed the development of highly parallel supercomputers; for instance, the Cray T3D (around 1994) had 512 processors and 32 GB of memory and was rated at about 40 gigaflops, whilst the Cray T3E (around 1998) had 576 processors and 148 GB of RAM and was rated at about 122 gigaflops of sustained speed.
We are now in an era of petaflop computing.
Do we actually need extra computational power? Most geographers probably think that their PC
with all the power of a mid-1980s mainframe is more than they need to run their 1970s and early
1980s vintage modelling and statistical technologies! In some ways they are correct, but this is the
wrong perspective to apply in an era where HPC offers or promises 10,000 or more times that level
of performance. Nearly all the mathematical models and statistical analysis tools used today in
geography come from an era of either manual calculation or slow and small computers. They use
shortcuts, numerous simplifications, etc., to minimise the amount of computation that is performed.
Indeed, most of our computational technology is old-fashioned and outmoded, and likely to yield far
poorer results than more leading-edge tools. However, it is important to be able to demonstrate that
if we perform 10,000 or several million times more computation, the benefits are worthwhile. If GC
is to survive, then it is this challenge that needs to be convincingly addressed.
Macmillan (1998) observes that there are strong elements of continuity between GC and the
established traditions of quantitative geography because they share the same scientific philosophy.
The view here is that GC could easily encompass quantitative geography if it so wished and if there
were some virtue in so doing. However, GC is more than quantitative geography ever aspired to be.
Today's quantitative geography can be regarded as a repository for various legacy statistical and
mathematical technologies that reflect an era of slow computers. Quantitative geography was a
computationally minimising technology, reflecting its origins in a hand-computation era. Analytical
approximation and clever mathematical manipulation had to substitute for the lack of computing
power. The absence of data fostered a theoretical perspective because there was seldom any other
possibility. It is this technology and outlook that still survives in modern quantitative geography.
The idea of running a computer program for a month or a year is still something quite alien. Indeed,
it is only in the last 5 years that computing environments have changed to such a degree that large-
scale computation is now routinely feasible. In 1998, it could be calculated that 12 h on a 512-processor
Cray T3E parallel supercomputer was broadly equivalent to somewhere between 4 and 8 years of non-
stop computing on a top-end workstation or PC. This is the emerging world within which GC is
located; it was never the world of quantitative geography where such vast amounts of computation
were seldom envisaged. The challenge for GC is to develop the ideas, the methods, the models and
the paradigms able to use the increasing computer speeds to do useful, worthwhile, innovative and
new science in a variety of geo-contexts.
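The equivalence quoted above (12 h on a 512-processor Cray T3E being worth roughly 4 to 8 years on a single workstation) can be checked with simple arithmetic. The sketch below is illustrative only: the per-processor speed ratio between a Cray T3E node and a late-1990s workstation is not stated in the text, so we solve for the range of ratios the quoted figures imply, assuming perfect parallel efficiency.

```python
# Back-of-envelope check of the claim that 12 h on a 512-processor
# Cray T3E equals roughly 4-8 years of non-stop workstation computing.
# The speed_ratio parameter is a hypothetical assumption, not a figure
# from the text; perfect parallel scaling is also assumed.

HOURS_PER_YEAR = 24 * 365  # 8760

def equivalent_years(run_hours, n_procs, speed_ratio):
    """Years a single workstation would need to match a parallel run.

    speed_ratio = (one Cray processor's speed) / (workstation speed).
    Real codes rarely achieve the perfect efficiency assumed here.
    """
    return run_hours * n_procs * speed_ratio / HOURS_PER_YEAR

# Solve for the per-processor speed ratio implied by the 4- and 8-year
# ends of the quoted range: years * 8760 = 12 * 512 * ratio.
for years in (4, 8):
    ratio = years * HOURS_PER_YEAR / (12 * 512)
    print(f"{years} years implies each Cray processor ~{ratio:.1f}x a workstation")

# A ratio of 7 lands near the middle of the quoted range.
print(f"{equivalent_years(12, 512, 7):.1f} years")  # -> 4.9 years
```

The quoted 4-8 year range thus corresponds to each Cray processor being roughly 6 to 11 times faster than a single workstation, a plausible gap for 1998 hardware once memory bandwidth and sustained (rather than peak) speeds are considered.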
1.5 WHAT DO OTHERS SAY ABOUT GEOCOMPUTATION?
The definitions of GC described here were those expressed by the author at the time of the First
International Conference on GeoComputation held in Leeds in September 1996. Since then, thinking
about the subject has intensified following two further conferences. The question now is: to what
extent do subsequent writers agree or disagree with these suggestions? To be fair, though, this work
has not previously been published and is thus largely unknown to them.
Rees and Turton (1998, p. 1835) define GeoComputation as '… the process of applying computing
technology to geographical problems'. At first sight, this definition would appear to suggest that
GC is equivalent to doing geography with a computer. However, it has to be understood that there
is an important distinction between 'doing geography with a computer' (which could be using a
computer to map data) and 'solving geographical problems with new computing power' (which is
what Rees and Turton wrote about).