development of the relational calculus as a means for the efficient reformulation of database
queries.
SPACE-TIME TRADEOFFS: An important byproduct of the work on formal languages and se-
mantics in the 1970s is the pebble game. In this game, played on a directed acyclic graph,
pebbles are placed on vertices to indicate that the value associated with a vertex is located in
the register of a central processing unit. The game allows the study of tradeoffs between the
number of pebbles (or registers) and time (the number of pebble placements) and leads to
space-time product inequalities for individual problems. These ideas were generalized in the
1980s to branching programs.
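The rules of the pebble game can be made concrete with a small simulation. The following sketch (the helper name, graph, and move sequence are illustrative, not from the text) checks that a sequence of moves is legal, and reports the space (maximum number of pebbles in use) and time (number of placements) of the pebbling:

```python
# Illustrative pebble-game simulator (hypothetical helper, not from the text).
# Rule: a pebble may be placed on a vertex only if every predecessor
# currently holds a pebble (input vertices have no predecessors and can
# always be pebbled); pebbles may be removed at any time.

def play(preds, moves):
    """preds: dict mapping each vertex to its list of predecessors.
    moves: list of ("place", v) or ("remove", v) moves.
    Returns (space, time): max pebbles held at once, and total placements."""
    pebbled = set()
    space = time = 0
    for op, v in moves:
        if op == "place":
            if any(p not in pebbled for p in preds[v]):
                raise ValueError(f"illegal placement on {v}")
            pebbled.add(v)
            time += 1
            space = max(space, len(pebbled))
        else:
            pebbled.discard(v)
    return space, time

# A three-vertex chain a -> b -> c: removing the pebble on a before
# pebbling c keeps the space at 2 pebbles for 3 placements.
preds = {"a": [], "b": ["a"], "c": ["b"]}
moves = [("place", "a"), ("place", "b"), ("remove", "a"), ("place", "c")]
print(play(preds, moves))  # (2, 3)
```

Trading space for time shows up when pebbles are removed and later re-placed: fewer registers can force values to be recomputed, which is exactly the tradeoff the space-time product inequalities capture.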
VLSI MODEL: When the very large-scale integration (VLSI) of electronic components onto
semiconductor chips emerged in the 1970s, VLSI models for them were introduced and an-
alyzed. Ideas from the study of pebble games were applied and led to tradeoff inequalities
relating the complexity of a problem to products such as AT^2, where A is the area of a chip
and T is the number of steps it takes to solve a problem. In the late 1970s and 1980s the
layout of computers on VLSI chips also became an important research topic.
ALGORITHMS AND DATA STRUCTURES: While algorithms (models for programs) and data struc-
tures were introduced from the beginning of the field, they experienced a flowering in the
1970s and 1980s. Knuth was most influential in this development, as later were Aho, Hopcroft,
and Ullman. New algorithms were invented for sorting, data storage and retrieval, problems on
graphs, polynomial evaluation, solving linear systems of equations, computational geometry,
and many other topics on both serial and parallel machines.
1.1.5 1980s and 1990s
PARALLEL COMPUTING AND I/O COMPLEXITY: The 1980s also saw the emergence of many
other theoretical computer science research topics, including parallel and distributed comput-
ing, cryptography, and I/O complexity. A variety of concrete and abstract models of parallel
computers were developed, ranging from VLSI-based models to the parallel random-access
machine (PRAM), a collection of synchronous processors alternately reading from and writ-
ing to a common array of memory cells and computing locally. Parallel algorithms and data
structures were developed, as were classifications of problems according to the extent to which
they are parallelizable. I/O complexity, the study of data movement among memory units
in a memory hierarchy, emerged around 1980. Memory hierarchies take advantage of the
temporal and spatial locality of problems to simulate fast, expensive memories with slow and
inexpensive ones.
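A PRAM computation proceeds in synchronous steps: in each step every processor reads from shared memory, computes locally, and then writes. The sketch below (an assumed example, not from the text) simulates this for a parallel sum, performing all reads of a round before any writes so that each round mirrors one synchronous PRAM step; n values are summed in about log n rounds:

```python
# Minimal PRAM-style sketch (illustrative, not from the text): processors
# sum the cells of a shared memory array in O(log n) synchronous rounds.
# Rounds are simulated sequentially; within a round, every read happens
# before any write, as in a synchronous PRAM step.

def pram_sum(values):
    mem = list(values)          # shared memory cells
    n = len(mem)
    stride = 1
    while stride < n:
        # Read phase: processor at cell i reads mem[i] and mem[i + stride].
        reads = [(mem[i], mem[i + stride])
                 for i in range(0, n - stride, 2 * stride)]
        # Write phase: each processor writes its pairwise sum back.
        for k, i in enumerate(range(0, n - stride, 2 * stride)):
            mem[i] = reads[k][0] + reads[k][1]
        stride *= 2
    return mem[0]

print(pram_sum([1, 2, 3, 4, 5, 6, 7, 8]))  # 36
```

With one processor per pair of cells, each round runs in constant parallel time, so this is the kind of problem classified as highly parallelizable.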
DISTRIBUTED COMPUTING: The emergence of networks of computers brought to light some
hard logical problems that led to a theory of distributed computing, that is, computing with
multiple and potentially asynchronous processors that may be widely dispersed. The prob-
lems addressed in this area include reaching consensus in the presence of malicious adversaries,
handling processor failures, and efficiently coordinating the activities of agents when interpro-
cessor latencies are large. Although some of the problems addressed in distributed computing
were first introduced in the 1950s, this topic is associated with the 1980s and 1990s.