model. Developments in information technology over the last decade have dramatically increased
the availability and sizes of spatial interaction data sets. The 1991 census provides journey-to-work
and migration data that contain 10,764 origin and destination zones. A parallel version of Equation
1.1 has been run on the KSR parallel supercomputer at Manchester and later ported to the Cray
T3D (see Turton and Openshaw, 1996).
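Equation 1.1 itself is not reproduced in this excerpt; the sketch below assumes it is the familiar origin-constrained spatial interaction model, T_ij = A_i O_i D_j exp(-b C_ij), which may differ in detail from the form used in the chapter. The zone count, array names and parameter value are illustrative, but the sketch shows why the calculation grows with the square of the number of zones and so invites parallel HPC.

```python
# A minimal sketch, assuming Equation 1.1 is the origin-constrained spatial
# interaction model T_ij = A_i * O_i * D_j * exp(-b * C_ij).
# All data and settings here are illustrative, not taken from the source.
import numpy as np

def spatial_interaction(O, D, C, b):
    """Return the predicted flow matrix T for origin totals O, destination
    attractiveness D, cost matrix C and deterrence parameter b."""
    F = np.exp(-b * C)                    # deterrence function
    A = 1.0 / (F @ D)                     # balancing factors so that sum_j T_ij = O_i
    return (A * O)[:, None] * D[None, :] * F

rng = np.random.default_rng(0)
n = 1000                                  # number of zones (10,764 in the 1991 census case)
O = rng.uniform(100, 1000, n)             # trips generated by each origin zone
D = rng.uniform(100, 1000, n)             # attractiveness of each destination zone
C = rng.uniform(1, 50, (n, n))            # travel cost between each pair of zones
T = spatial_interaction(O, D, C, b=0.1)
print(T.shape, T.sum(axis=1)[:3], O[:3])  # row sums reproduce the origin totals
```

With 10,764 zones the flow and cost matrices each hold roughly 1.2 × 10^8 entries, which is what motivated the parallel implementations mentioned above.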
Scalability is a very important property in the world of parallel HPC. It creates new opportunities for modelling large-scale interaction data. Telephone traffic data exist
for entire countries. In the United Kingdom, it is possible to imagine telephone call flow databases
with between 1.6 and 27 million zones in them. Equivalent data are generated by EFTPOS flows in
the retail sector. These databases, currently being stored in data warehouses, are also of profound
substantive interest since their data portray the microfunctioning of selected aspects of the entire
UK economic space. The daily trivia of a complete living nation is in there, just awaiting analysis.
Retail catchments, changing network effects and the space-time dynamics of individual behaviours are
all in there, somewhere. The spatial interaction model could be scaled up to capture only some of this,
and clearly entirely new modelling methodologies will be needed. Yet the possibilities are almost
endless if we have the imagination to create them and the HPC hardware is sufficiently large and
fast to meet the computational challenge.
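A rough calculation, using the zone counts quoted above and an assumed 8 bytes per flow, shows how quickly a full origin-destination matrix outgrows conventional hardware:

```python
# Back-of-envelope scale of a full origin-destination flow matrix; the
# 8-byte-per-flow storage assumption is illustrative.
for zones in (10_764, 1_600_000, 27_000_000):
    pairs = zones ** 2
    gigabytes = pairs * 8 / 1e9
    print(f"{zones:>10,} zones -> {pairs:.2e} O-D pairs, ~{gigabytes:,.1f} GB per matrix")
```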
Computer technology able to model the behaviour of atoms will soon be able to model more
and more of the behaviour in space and time of millions of individual people. As global resources
become more limited, as environmental concerns increasingly require behaviour modification and
as governments aim at a lifelong equality consensus, the task of people management will grow.
However, better planning requires better predictive modelling. We need to be able to model people's
behaviour if much progress is to be made. The problem at present is that the science of human
systems modelling (as it has been termed) is still at an extremely rudimentary stage of development;
see Openshaw (1995a) for a brief review. Nearly all the existing models are aggregate rather than
micro, static rather than dynamic and insufficiently non-linear to be of much use. A start has been
made, but much more is still needed.
1.7.2 New Parameter Estimation Methods
Not all of GC requires very large data sets, massive software investment or access to leading-edge
HPC. Diplock and Openshaw (1996) demonstrate some of the benefits of genetic and evolutionary
strategy-based parameter estimation methods compared with conventional non-linear optimisation
methods. Computer models (e.g. the spatial interaction model in Equation 1.1) with exponential
terms in them contain considerable opportunities for arithmetic instabilities to arise, because the
exponential deterrence function can readily generate very large and very small numbers depending
on the parameter b. In fact, the range of b values that triggers no arithmetic protection conditions
is extremely small (typically plus or minus one, depending on how the C_ij values are scaled),
given that the parameter b could in theory range from minus infinity to plus infinity. The problem
becomes worse as more parameters are used. Yet it is this function landscape of flat regions,
vertical cliffs and narrow valleys leading to the optimal result that conventional parameter
optimisation methods have to search. If they hit any of the barriers or flat regions, they tend to
become stuck, and because this is dumb technology, they have no way of telling you that this has
happened. The implication is that potentially all statistical and mathematical models with
exponential terms in them can produce the wrong result, because there is no assurance that the
conventional non-linear optimisers in current use can safely handle these invisible arithmetic
problems. Newer methods function well on such problems: they are more robust, they are not
affected by floating-point arithmetic problems, and they can handle functions that are non-convex,
discontinuous and riddled with multiple suboptima (see Diplock and Openshaw, 1996; Heppenstall and
Harland, 2014). The drawback is that they require about 1000 times more computation. Once it was
impossible to use this technology except on a small scale; now it can be far more widely applied.
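To make both points concrete, here is a minimal sketch. It is not the Diplock and Openshaw (1996) procedure, and all data, values and settings are illustrative: the first loop shows how quickly exp(-b * C_ij) breaks down once b leaves a narrow range, and the second runs a simple (1+lambda) evolution-strategy-style search that scores arithmetic failures as infinitely bad and so carries on where a gradient-based optimiser would silently get stuck.

```python
# Sketch of (1) the arithmetic instability of exponential deterrence terms and
# (2) a robust evolution-strategy-style parameter search. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)
C = rng.uniform(1, 50, (200, 200))          # illustrative cost matrix
T_obs = np.exp(-0.08 * C)                   # synthetic "observed" flows (true b = 0.08)

def sse(b):
    """Sum of squared errors of the deterrence term; flags arithmetic failure."""
    with np.errstate(over="ignore", under="ignore"):
        T_pred = np.exp(-b * C)
        err = np.sum((T_pred - T_obs) ** 2)
    return err if np.isfinite(err) else np.inf

# exp(-b * C) only behaves for a narrow band of b values
for b in (-20.0, -1.0, 0.08, 1.0, 20.0):
    print(f"b = {b:>6}: sse = {sse(b)}")

# (1+lambda) evolution strategy: mutate the current estimate, keep the best child
b_best, f_best, step = 0.0, sse(0.0), 1.0
for generation in range(200):
    children = b_best + step * rng.standard_normal(10)
    fitness = np.array([sse(b) for b in children])
    if fitness.min() < f_best:
        b_best, f_best = children[fitness.argmin()], fitness.min()
        step *= 1.2                          # success: take bolder steps
    else:
        step *= 0.8                          # failure: search more locally
print(f"estimated b = {b_best:.4f} (true value 0.08)")
```

Because each generation evaluates the model many times, such searches cost far more computation than a single gradient descent, which is the roughly 1000-fold penalty noted above.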