the universe of alternative spatial interaction models that could be built up from the available pieces
(e.g. variables, parameters, unary and binary operators, standard math functions and reverse
Polish rules for well-formed equations) by using evolutionary programming algorithms to breed new
model forms. These methods are explicitly parallel (each member of a population of models is
evaluated in parallel) and also implicitly parallel (the genetic algorithm's schemata theorem). The
problem with AMS was the use of fixed length bit strings. Koza (1992, 1994) describes how this can
be overcome by using what he terms GP. The AMS approach has been redeveloped in a GP format,
which is far more suitable for parallel rather than vector supercomputers. The results from porting
the GP codes on to the Cray T3D suggest that not only can existing conventional models be
rediscovered but also that new model forms performing two or three times better can be
found (Turton and Openshaw, 1996; Turton et al., 1997; Diplock, 1996, 1998). Some of the GP runs
reported in Turton et al. (1996, 1997) required over 8 h on a 256-processor Cray T3D. It is likely that
2-week long runs on a 512-processor machine would yield even better results, but this is seven times
greater than the total ESRC allocation of Cray T3D time in 1996. In these complex search problems,
the quality of the results depends totally on the available HPC. Runs of this magnitude, which were
barely feasible in 1996, will one day be considered trivial, and future historians will be amazed at how
poor the HPC hardware of the era was. If the new methods work well, then they would constitute a
means of extracting knowledge and theories from the increasingly data-rich geographical world all
around us. The key point to note here is that it is becoming increasingly possible to compute our way
to better models.
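The model-breeding idea can be illustrated with a toy sketch. This is an illustrative reconstruction under stated assumptions, not the AMS or Cray T3D code: candidate spatial interaction models are expression trees over origin size O, destination size D and distance d; a population of model forms is scored against observed flows and the fitter forms are mutated. All operator choices, population sizes and parameter settings here are assumptions made for the example.

```python
import math
import random

# Operators available to the breeder: (arity, function). The div and exp
# operators are guarded so every random model form evaluates to a number.
OPS = {
    "add": (2, lambda a, b: a + b),
    "mul": (2, lambda a, b: a * b),
    "div": (2, lambda a, b: a / b if abs(b) > 1e-9 else 1.0),
    "exp": (1, lambda a: math.exp(max(min(a, 50.0), -50.0))),
}
TERMINALS = ["O", "D", "d", "c"]  # origin, destination, distance, constant


def random_tree(depth=3):
    """Grow a random model form as a nested-tuple expression tree."""
    if depth == 0 or random.random() < 0.3:
        t = random.choice(TERMINALS)
        return ("c", random.uniform(0.1, 2.0)) if t == "c" else t
    op = random.choice(list(OPS))
    arity = OPS[op][0]
    return (op,) + tuple(random_tree(depth - 1) for _ in range(arity))


def evaluate(tree, env):
    """Evaluate an expression tree for one origin-destination pair."""
    if isinstance(tree, str):
        return env[tree]
    if tree[0] == "c":
        return tree[1]
    op, args = tree[0], tree[1:]
    return OPS[op][1](*(evaluate(a, env) for a in args))


def fitness(tree, data):
    """Sum of squared errors between predicted and observed flows."""
    return sum((evaluate(tree, {"O": O, "D": D, "d": d}) - T) ** 2
               for O, D, d, T in data)


def mutate(tree, depth=2):
    """Replace a randomly chosen subtree with a fresh random tree."""
    if isinstance(tree, str) or tree[0] == "c" or random.random() < 0.3:
        return random_tree(depth)
    op, args = tree[0], tree[1:]
    i = random.randrange(len(args))
    new_args = list(args)
    new_args[i] = mutate(args[i], depth)
    return (op,) + tuple(new_args)


def breed(data, pop_size=60, generations=40):
    """Keep the fitter half of each generation and refill it with mutants."""
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: fitness(t, data))
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return min(pop, key=lambda t: fitness(t, data))


# Synthetic flows generated from a conventional gravity model T = O*D/d^2,
# so the breeder is asked to rediscover (or beat) a known model form.
random.seed(1)
data = [(O, D, d, O * D / d ** 2)
        for O in (10.0, 20.0) for D in (5.0, 15.0) for d in (1.0, 2.0, 4.0)]
best = breed(data)
print("best model SSE:", fitness(best, data))
```

Each member of the population is scored independently, which is the sense in which the search is explicitly parallel: in a real run, the fitness evaluations would be distributed across processors.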
Other new approaches to building new types of spatial models are described in Openshaw
(1998c). He compares the performance of a selection of genetic, evolutionary, neural net and fuzzy
logic spatial interaction models. In general, performance improvements of more than 200% over
conventional models are possible, which is more than sufficient to justify the 10,000-100,000 times
more computation involved. Some of these new models are purely black boxes (viz. the neural
network models), but others are capable of plain English expression (the fuzzy logic models) or are
in equation form (derived from AMS or GP). See Beriro et al. (2014) and Heppenstall and Harland
(2014) in this topic for more recent developments in these fields.
Certainly there are problems that still need to be resolved, but GC is about revolutionary tech-
nology. Old truisms may no longer hold good. Old barriers may have gone and have been replaced
by others that are not yet understood. You have to believe that the impossible (i.e. previously the
infeasible) is now possible or else no progress will be made. Put your GC spectacles on, however,
and suddenly the world is a different and more exciting place, though it still requires you to develop
a degree of self-confidence that you can go safely and carefully where others have yet to tread.
1.7.5 Parallel Zone Design and Optimal Geographical Partitioning
Some other GC applications involve applying existing methods that have been patiently waiting for
increases in the speed of hardware and the provision of GIS data. Zone design is one of these. The
basic algorithms were developed over 30 years ago (Openshaw, 1976, 1978, 1984), but until digital
map boundary data became routinely available in the 1990s and computer hardware much faster, it
was not a practical technology once N (the number of zones) exceeded a small number. The
challenge now is to provide routine access to the technology and to make the latest algorithms available
(Openshaw and Rao, 1995; Openshaw and Alvanides, 1999). If you can get that far, then you have
to start raising awareness among potential users so that they realise what is now possible and start to use it.
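The core zone-design optimisation can be sketched as follows. This is a simplified illustration under stated assumptions, not Openshaw's AZP or the parallel Cray codes: N small census units are aggregated into M output zones by a simulated annealing search that balances zone populations. The real algorithms also enforce zone contiguity on the digital boundary data, which is omitted here for brevity; all function names and parameter settings are assumptions made for the example.

```python
import math
import random


def objective(assignment, pops, M):
    """Sum of squared deviations of zone populations from their mean."""
    totals = [0.0] * M
    for unit, zone in enumerate(assignment):
        totals[zone] += pops[unit]
    mean = sum(totals) / M
    return sum((t - mean) ** 2 for t in totals)


def anneal_zones(pops, M, steps=20000, t0=1e6, cooling=0.9995, seed=0):
    """Reassign one unit at a time, accepting worse moves with Boltzmann probability."""
    rng = random.Random(seed)
    assignment = [rng.randrange(M) for _ in pops]  # random initial zoning
    cost = objective(assignment, pops, M)
    best, best_cost = list(assignment), cost
    temp = t0
    for _ in range(steps):
        unit = rng.randrange(len(pops))
        old_zone = assignment[unit]
        new_zone = rng.randrange(M)
        if new_zone != old_zone:
            assignment[unit] = new_zone
            new_cost = objective(assignment, pops, M)
            delta = new_cost - cost
            # Always accept improvements; accept worse moves with
            # probability exp(-delta / temp), which shrinks as temp cools.
            if delta <= 0 or rng.random() < math.exp(-delta / temp):
                cost = new_cost
                if cost < best_cost:
                    best, best_cost = list(assignment), cost
            else:
                assignment[unit] = old_zone  # reject: undo the move
        temp *= cooling
    return best, best_cost


# 100 synthetic census units aggregated into 8 output zones.
rng = random.Random(42)
pops = [rng.uniform(50.0, 500.0) for _ in range(100)]
zones, cost = anneal_zones(pops, M=8)
print("balanced-population objective:", cost)
```

A parallel version would evaluate many candidate moves (or run many annealing chains) simultaneously, which is what makes the approach a natural fit for the HPC hardware discussed above.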
Most of the ground work has been done. Parallel zone design codes exist and a parallel simulated
annealing algorithm has been developed; see Openshaw and Schmidt (1996). Yet the principal
barrier to application is not algorithmic or HPC related but a lack of awareness. It is unbelievable that in many
countries the explicit and careful design of sensible census output areas is still not regarded as
important. Surely this application is itself broadly equivalent in importance to many of the HPC
projects in other areas of science, yet because of the absence of a computational culture, it is prob-
ably still regarded as being of the lowest priority and far too advanced for operational use. Yet we