data from punched cards, performed calculations, reorganized them, and then printed the results. The first automatic calculator was the Mark I (1944). The binary number system is still the basis of present computer languages: machine languages encode electrical states in the computer as combinations of 0s and 1s, using the digits 0 and 1 with a base of 2 to convey data and instructions to the computer.
The first electronic digital calculator, ENIAC, was constructed in 1946 at the University of Pennsylvania. ENIAC weighed 30 tons, covered 1,500 square feet of floor space, and consumed large amounts of electricity, with frequent electrical failures. Developments in computer technology in the 1960s and 1970s brought about the evolution of hardware (the physical equipment of the computer) and then of many types of software packages (the programs that the hardware executes). Successive advancements in computer technology are described as generations.
The first-generation computers used vacuum tubes. The UNIVAC I was produced in 1951. These machines were complicated, heavy, expensive, large, and unreliable, but faster than their predecessors. Punched cards still served for entering instructions and data. The data, coded in a symbolic language, had to be translated into machine language and recorded on a magnetic drum before being executed by the computer. Navy Commodore Grace Murray Hopper developed one of the first translation programs in 1952.
The second-generation computers (about 1956) used transistors; these were more reliable than vacuum tubes, smaller, and faster. Transistors, soldered to circuit boards, use less power and generate less heat. Computers gained magnetic-core memory, and with it a larger storage capacity. They also used magnetic tapes and disks for storage and input/output. The first operating systems were developed; programming was done in machine language and assembly language, and high-level languages such as COBOL and FORTRAN were developed.
The third-generation computers (1964) used integrated circuits etched on silicon chips (developed by Jack S. Kilby of Texas Instruments) to form the primary storage. Again, computers became smaller, faster, less expensive, and more reliable. This period began with small-scale integration technology, with only a few transistors per chip, and progressed to medium-scale integration, with hundreds of transistors per chip. IBM's System/360 revolutionized the industry because of the main storage capacity in its central processing unit. The first minicomputers were released, with the capabilities of a full-size system but smaller memory. Compatibility among computers became possible due to the introduction of software. Third-generation computers were accessible from remote terminals.
The fourth-generation computers (from 1970) have used large-scale integrated circuits and microprocessors. They are characterized by increased speed, storage capacity, and versatility. Large-scale integration (LSI) circuits progressed to very-large-scale integration (VLSI) circuits that contain thousands of transistors densely packed on a single silicon chip. The microprocessor, built on a small silicon chip, is the central processing unit of a microcomputer; its small size and great versatility made possible the production of home computers, mainframes, minicomputers, and microcomputers. The next steps in the development of the fourth-generation computers included the arrival of the GUI (graphical user interface), as on the Macintosh SE/30 in 1989, the Macintosh Classic and Macintosh LC in 1990, and the Macintosh Classic II in 1991. Built-in math coprocessors sped up computing. In 1989 Tim Berners-Lee invented the World Wide Web, a global information exchange. The Mosaic browser appeared in 1993, followed by Netscape in 1994. These computers have been used in the fields of simulation, virtual reality, multimedia, and data communication because of their high processing speed and storage capacity.