electric charges stored in memory chips in a client device, such as a web browser
on a mobile phone. These messages are often called the realization of some abstract
informational content.
Already, information reveals itself to be not just a singular thing, but something
that exists at multiple levels: How do the bits become a message in HTTP? In
particular, we are interested in the distinction in information between content and
encoding. Here our vague analogy with Shannon's information theory fails, as
Shannon's theory deals with finding the optimal encoding and size of channel so
that the message can be guaranteed to get from the sender to the receiver, which in
our case is taken care of by the clever behavior of the TCP/IP protocol operating
over a variety of computational devices (Shannon and Weaver 1963). Yet, how can
an encoding be distinguished from the content of information itself in a particular
HTTP message? Let's go back to bits by leaning on aesthetic theory of all things; art
critic and philosopher Nelson Goodman defines a mark as a physical characteristic
ranging from marks on paper one can use to discern alphabetic characters to ranges
of voltage that can be thought of as bits (1968). To be reliable in conveying
information, an encoding should be physically 'differentiable' and thus maintain
what Goodman calls 'character indifference' so that (at least within some context)
each character (as in 'characteristic') cannot be mistaken for another character. One
cannot reconstruct a message in bits if one cannot tell apart 1 and 0, much as one
cannot reconstruct an HTML web-page if one cannot tell the various characters in
the text apart. So, an encoding is a set of precise regularities that can be realized by
the message. Thus, one can think of multiple levels of encoding, with the very basic
encoding of bits being handled by the protocol TCP/IP, and the protocol HTTP then
handling higher-level textual encodings such as HTML.
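This layered view can be made concrete with a minimal sketch (not a real network stack): TCP/IP delivers a stream of bytes, HTTP interprets those bytes as characters under some declared character encoding (UTF-8 is assumed here for illustration), and HTML imposes a higher-level textual encoding on those characters.

```python
# What the TCP/IP layer delivers: a reliable stream of raw bytes.
raw_bytes = b"<html><body>Hello</body></html>"

# The HTTP layer decodes the byte stream into characters, using a
# character encoding (assumed UTF-8, as a Content-Type header might declare).
text = raw_bytes.decode("utf-8")

# The HTML layer treats those characters as markup: a still-higher encoding.
assert text.startswith("<html>") and text.endswith("</html>")
```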
Unfortunately, we are not out of the conceptual thicket yet; there is more
to information than encoding. Shannon's theory does not explain the notion of
information fully, since giving someone the number of bits that a message contains
does not tell the receiver what information is encoded. Shannon explicitly states,
“The fundamental problem of communication is that of reproducing at one point
either exactly or approximately a message selected at another point. Frequently
the messages have meaning; that is they refer to or are correlated according to
some system with certain physical or conceptual entities. These semantic aspects
of communication are irrelevant to the engineering problem” (1963). He is correct,
at least for his particular engineering problem. However, Shannon's use of the term
'information' is for our purposes the same as the 'encoding' of information, but
a more fully-fledged notion of information is needed. Many intuitions about the
notion of information concern not only how the information is encoded, but also
what a particular message is about: the content of an information-bearing
message. 8
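The distinction between content and encoding can be sketched in code: the same content realized under two different character encodings yields two different physical byte sequences, yet both decode back to the identical message. (The choice of UTF-8 and UTF-16 here is purely illustrative.)

```python
content = "Hello"

# Two encodings of the same content produce different physical
# realizations (distinct byte sequences)...
utf8_bytes = content.encode("utf-8")
utf16_bytes = content.encode("utf-16")
assert utf8_bytes != utf16_bytes

# ...yet both decode back to the very same content.
assert utf8_bytes.decode("utf-8") == content
assert utf16_bytes.decode("utf-16") == content
```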
'Content' is a term we adopt from Israel and Perry,
8 An example of the distinction between content and encoding: imagine Daniel sending Amy
a secret message about which one of her co-workers won a trip to the Eiffel Tower. Just
determining that a single employee out of 8 won the lottery requires at least a 3-bit encoding
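The footnote's arithmetic follows Shannon's measure: singling out one outcome among 8 equally likely possibilities requires log2 8 = 3 bits. A quick check:

```python
import math

employees = 8
# Bits needed to distinguish one winner among 8 equally likely employees.
bits_needed = math.log2(employees)
assert bits_needed == 3  # since 2**3 = 8 distinct messages
```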