3. To permit larger databases to be analysed and/or to obtain better results by being able to
process finer resolution data
4. To develop new approaches and new methods based on computational technologies developed by other disciplines, particularly AI and computer vision; to find new ways of solving optimisation problems; and generally to become opportunistic and entrepreneurial, tackling old problems with new technologies and doing new things that are relevant to geographical concerns but are currently limited by processor speed and perhaps also memory size
All are important although some are much more readily attainable than others. In some applications,
there are almost instant benefits that can be gained with a minimal degree of effort. Yet in others,
it could be 5-10 years before immature research blossoms into something useful. One problem for
geographical HPC is that users in other areas of science have a considerable head start in developing technology and raising awareness levels within their research communities, and have research councils that now respond to their needs for HPC. Other problems are of their own making, for example, the various paradigm wars and artificially self-constructed philosophical and attitudinal barriers.
Methodological pluralism is good but tolerance is also a necessary condition. Nevertheless, there is
a growing belief that the time is ripe for HPC initiatives in geography and the social sciences and
the international growth in popularity of GC is one indicator of this change.
1.7 SOME EXAMPLES OF OLD AND NEW GEOCOMPUTATION
Even though GC is a new term, it is possible to recognise applications that today would be called GC
but previously were regarded either as quantitative geography or as GIS or spatial analysis. Some
examples may help us better understand the GC ethos or style and how GC fits in with what quantitatively minded researchers have always done.
1.7.1 Parallel Spatial Interaction Modelling and Location Optimisation
One of the earliest uses of parallel computing in geography has concerned the parallelisation of the
spatial interaction model; see Harris (1985) and Openshaw (1987). This model is central to several
historically important areas of regional science, urban and regional planning and spatial decision
support (Wilson, 1974; Birkin et al., 1996). For illustrative purposes, the simplest spatial interaction
model can be expressed as
T_ij = A_i O_i D_j exp(−bC_ij)    (1.1)
where
T_ij is the predicted flow from origin i to destination j
A_i is an origin constraint term
O_i is the size of origin zone i
D_j is the attractiveness of destination j
C_ij is the distance or cost of going from origin i to destination j
b is a parameter that has to be estimated
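As a minimal sketch (not from the chapter itself), the model in Equation 1.1 can be written in a few lines of vectorised NumPy. The function name and the formulation below are illustrative assumptions: the origin constraint terms A_i are computed as balancing factors so that each origin's predicted flows sum to its size O_i, which is the usual production-constrained form.

```python
import numpy as np

def spatial_interaction(O, D, C, b):
    """Production-constrained spatial interaction model (cf. Equation 1.1).

    O: origin sizes, shape (N,)
    D: destination attractiveness, shape (M,)
    C: travel cost/distance matrix, shape (N, M)
    b: distance-decay parameter (assumed positive here)
    Returns T, shape (N, M), where T[i, j] is the predicted flow i -> j.
    """
    # Deterrence matrix: every entry is independent, so this step is
    # trivially parallel -- the property the chapter highlights.
    F = np.exp(-b * C)
    # Balancing factors A_i ensure sum_j T[i, j] == O[i] (origin constraint).
    A = 1.0 / (F @ D)
    # Broadcast A_i * O_i down the rows and D_j across the columns.
    return (A * O)[:, None] * D[None, :] * F
```

Because every T_ij (and every entry of the deterrence matrix) depends only on its own row and column data, the N-squared work divides cleanly across processors, which is exactly why the model parallelises so well.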
This model was originally derived in a theoretically rigorous way by Wilson (1970) using an entropy
maximising method. Clearly this model is implicitly highly parallel since each T_ij value can be computed independently. Parallelisation here is important because the model presents a computational challenge: computer times increase with the square of the number of zones (N). Small N values can be run on a PC but large N values need a supercomputer. The quality of the science reflects both the number of zones (more zones provide better resolution than few) and the specification of the