Force for Change #3: The Evolution of the Garden-Variety Computer System Has Been Amazing
To compare computers, we might use this formula as a gross measure of computing power (P):
P ~ (S*M)/$
to be read as P is proportional to “computing speed” times “primary memory size” divided by “machine
cost.” Let's look at the change of computing power in the past 50-odd years (since the second-generation
machines came out, using transistors instead of electron tubes).
A popular machine in 1960 (the IBM 1620) had a memory of 20,000 bytes. A popular machine today
could have 4 billion bytes (4 gigabytes). So today's machine has on the order of 200,000 times as
much memory.
The IBM 1620 had a clock speed of 50 kilohertz. Today's machine might run at 2.5 gigahertz. That's
another factor of 50,000.
The cost of the IBM 1620 in 1960 was about $100,000 or, considering inflation, about $500,000 in today's dollars. Today a popular machine might cost $2,000. This gives us a factor of 250.
So one could say that, considering cost, today's machine is 200,000 × 50,000 × 250 times as powerful as
one in 1960. That's a factor of 2.5 trillion. To put the idea of a “factor” in perspective, consider a factor of
“2.” Let's say your income was doubled (or halved). That's a factor of 2. Quite an effect, no? Or consider
a factor of “10”—say, the speed of a car compared to a human running speed, or the speed of an airliner
over a car. A factor of 10 changes the nature of whatever is being considered. What does a factor of
2.5 trillion do? Boggles the mind, that's what. Of course, much of that increase is used up with graphics,
poor programming, and nonuse (a computer that is only typed on loafs along at about 1 percent
of its capacity, no matter how fast you type). Still, when the chips are down—so to speak—today's
machines are quite amazing.
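The arithmetic behind that 2.5 trillion figure can be checked directly with the formula P ~ (S*M)/$ and the approximate numbers given above (a short sketch; the function name and the rounding are mine):

```python
# Rough comparison of 1960 vs. today's computing power using P ~ (S * M) / $.
# All figures are the approximate ones from the text.

def power(speed_hz, memory_bytes, cost_dollars):
    """Gross measure of computing power: speed times memory, divided by cost."""
    return speed_hz * memory_bytes / cost_dollars

# IBM 1620 (1960): 50 kHz clock, 20,000 bytes of memory, ~$500,000 in today's dollars.
p_1960 = power(50e3, 20_000, 500_000)

# A popular machine today: 2.5 GHz, 4 gigabytes, ~$2,000.
p_today = power(2.5e9, 4e9, 2_000)

ratio = p_today / p_1960
print(f"Improvement factor: {ratio:.1e}")  # about 2.5e+12, i.e. 2.5 trillion
```

The ratio is exactly the product of the three individual factors (200,000 for memory, 50,000 for speed, 250 for cost), since each term of the formula contributes independently.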
In addition to the sheer increase in power of today's computers, the hardware has become much more
reliable, so that failures are quite rare, even over years. However, this is much more than offset by the
fact that software has become less reliable. Software vendors tend to want to get their products on the
market quickly, whether they are buggy or not. 2
Spatial Data
When location or position is used as a primary referencing basis for data, the data involved are known as
spatial data. For example, the elevations, in feet, of the landmarks Clingmans Dome and Newfound Gap
in the Great Smoky Mountains National Park are data. If the primary referencing basis for these data is
“Great Smoky Mountains National Park,” or x miles south of Gatlinburg, Tennessee, on U.S. Highway 441, or
p degrees latitude and q degrees longitude, then the elevations could be referred to as spatial data.
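The idea can be made concrete with a minimal sketch in which latitude and longitude serve as the primary referencing basis — the key — for the elevation values. The coordinates and elevations below are approximate figures supplied here for illustration, not taken from the text:

```python
# Spatial data: position (latitude, longitude) is the primary referencing
# basis, and the elevation in feet is the data referenced by it.
# Coordinates and elevations are approximate, for illustration only.

landmarks = {
    # (latitude, longitude) -> elevation in feet
    (35.563, -83.498): 6643,   # Clingmans Dome
    (35.611, -83.425): 5046,   # Newfound Gap
}

for (lat, lon), elev_ft in landmarks.items():
    print(f"({lat:.3f}, {lon:.3f}): {elev_ft} ft")
```

Because the lookup key is a position on the Earth's surface, the elevations stored this way are spatial data in the sense just defined; the same numbers keyed by, say, an arbitrary record number would not be.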
2 It is my opinion that 30 percent of those who fund and supervise computer programming should have to be
examined, remedially trained where necessary, and then certified. The other 70 percent should be taken out and shot.
(I'm joking, of course—but there are major problems with the software produced in the last decade or so.)