best a limited channel even compared to punchcards. After extensively testing which devices enabled the lowest latency between humans and machines, Engelbart invented the mouse and other, less successful interfaces, like the one-handed 'chord' keyboard (Waldrop 2001). These interfaces decreased the temporal latency between humans and computers even further. Strangely enough, we have not - despite all the hyperbole around tactile or haptic interfaces from various media labs - gone far beyond keyboards, mice, and touch-screens in 50 years.
2.1.2 The Internet
The second barrier to be overcome was space, so that any computer should
be accessible regardless of its physical location. The Internet “came out of our
frustration that there were only a limited number of large, powerful research
computers in the country, and that many research investigators who should have
access to them were geographically separated from them” (Leiner et al. 2003).
Licklider's lieutenant Bob Taylor and his successor Larry Roberts contracted
Bolt, Beranek, and Newman (BBN) to create the Interface Message Processor,
the hardware needed to connect the various time-sharing computers of Licklider's
“galactic network” that evolved into the ARPANet (Waldrop 2001). While BBN
provided the hardware for the ARPANet, the software was left undetermined, so an
informal group of graduate students constituted the Internet Engineering Task Force
(IETF) to create software to run the Internet (Waldrop 2001).
The IETF has historically been the main standardization body that creates
the protocols that run the Internet. It still maintains the informal nature of its
foundation, with no formal structure such as a board of directors, although it is
officially overseen by the Internet Society. The IETF informally cites as its
main organizing principle the credo "We reject kings, presidents, and voting. We
believe in rough consensus and running code" (Hafner and Lyons 1996). Decisions
do not have to be ratified by consensus or even majority voting, but require
only a rough measure of agreement on an idea. The most important products of
these mailing-list discussions and meetings are IETF RFCs (Requests for Comments),
which differ in their degree of stability, from the unstable 'Experimental' to the
most stable 'Standards Track.' The RFCs define Internet standards such as URIs
and HTTP (Berners-Lee et al. 1996, 2005). RFCs, while not strictly academic
publications, have a de facto normative force on the Internet and therefore on the
Web, and so they will be referenced considerably throughout this topic.
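As a small illustration of what these RFCs actually standardize, the generic URI syntax carves an identifier into scheme, authority, path, query, and fragment components; the sketch below, using Python's standard library and an invented example URI, shows that decomposition.

```python
from urllib.parse import urlparse

# A hypothetical URI, chosen only to exercise every component
# of the generic syntax: scheme://authority/path?query#fragment
uri = "http://www.example.org/wiki/Main_Page?action=view#History"
parts = urlparse(uri)

print(parts.scheme)    # "http"
print(parts.netloc)    # "www.example.org"
print(parts.path)      # "/wiki/Main_Page"
print(parts.query)     # "action=view"
print(parts.fragment)  # "History"
```

The same component names recur throughout the Web's standards, which is one reason RFCs, despite their informal origins, carry normative force.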
Before the Internet, networks were assumed to be static and closed systems, so
one either communicated within a given network or not at all. However, early network
researchers determined that there could be an "open architecture networking" in
which a meta-level "internetworking architecture" would allow diverse networks to
connect to each other as peers, unlike earlier interconnection schemes that "required
that one be used as a component of the other, rather than acting as a peer of the
other in offering end-to-end service" (Leiner et al. 2003).