CHAPTER 7
Justification of Computer Cognition Development Based on Human
Introduction
The Application section proposes using the findings of the Breadth and Depth segments to advance cognitive human development, leading to the design of computers for processing data and information. The technology was intended to raise human technological awareness and to facilitate social change. The author examines the index of cognitive human thinking development that affected the creation and design of RAM and CPUs. The chapter chronologically outlines the progress made by computer science professionals and information technology engineers as a result of such cognitive human development, from the time of primitive man to the current space age. It also considers the concepts and technological coordination facilitated by cognitive human development, the inspiration behind those concepts, and, through relevant practical application, how that development benefited and socialized mankind.
Background of Computer Origin
The concept of computer machines is rooted in the theoretical foundations of semiconductor science together with research on human cognitive capability; those foundations conceptualized the science of the computer built from semiconductor materials. The Turing machine was an imaginary, though not quite hypothetical, computer invented in 1936 by the English mathematician Alan Turing (1912-1954) to help solve a question in mathematical logic. As a by-product, Turing founded a new field of research known as computation theory, or computability, which includes the study of the abilities and limitations of digital computers, among them those that arise from human limitations.
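The machine Turing described can be pictured as nothing more than a tape, a read/write head, and a small table of rules. The following is a minimal sketch of that idea, not Turing's own formulation: a toy program whose rule table flips every bit on the tape and halts at the first blank. The state names and rule table are illustrative assumptions.

```python
# Minimal Turing-machine sketch: a tape, a head position, and a
# transition table mapping (state, symbol) -> (write, move, next state).

def run(tape, rules, state="scan", blank="_", halt="halt"):
    tape = list(tape)
    head = 0
    while state != halt:
        symbol = tape[head] if head < len(tape) else blank
        write, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)          # extend the tape on demand
        head += 1 if move == "R" else -1
    return "".join(tape).rstrip(blank)

# Illustrative rule table: flip each bit, halt on the first blank.
rules = {
    ("scan", "0"): ("1", "R", "scan"),
    ("scan", "1"): ("0", "R", "scan"),
    ("scan", "_"): ("_", "R", "halt"),
}

print(run("0110", rules))  # prints "1001"
```

Despite its simplicity, adding more states and rules to such a table is enough, in principle, to express any computation a digital computer can perform.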
Although Turing's rather implausible computer benefits from its extreme simplicity, the basic machine performs just a few simple operations; with anything less than what it was intended for, it would do nothing at all. Yet many research studies have shown that, through combinations of simple operations using logic gates ( and, or, nand, nor, not, and exclusive or/nor ), the machine can perform any computation that can be performed on modern digital computers (Petzold 2008, p. 77). By stripping a computer down to the raw basics, one can better understand the abilities of digital computers and, just
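The claim that combinations of simple gates suffice for any computation can be made concrete with a small sketch. Here every gate listed above is built from nand alone, a standard illustration of gate universality; the function names are illustrative, not from the source.

```python
# Gate universality sketch: each Boolean gate below is composed solely
# from NAND, showing how a few simple operations combine into all of
# the gates named in the text (and, or, nand, nor, not, xor, xnor).

def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a):      return nand(a, a)
def and_(a, b):   return not_(nand(a, b))
def or_(a, b):    return nand(not_(a), not_(b))
def nor_(a, b):   return not_(or_(a, b))
def xor_(a, b):   return and_(or_(a, b), nand(a, b))
def xnor_(a, b):  return not_(xor_(a, b))

# Print the truth table for a few of the derived gates.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", and_(a, b), or_(a, b), xor_(a, b))
```

Layering such compositions, first gates, then adders, then registers, is precisely how RAM and CPUs realize Turing's abstract machine in hardware.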