as opposed to the more confusing term 'semantic information' as employed by
Floridi and Dretske (Israel and Perry 1990; Dretske 1981; Floridi 2004). One of
the first attempts to formulate a theory of informational content was due to Carnap
and Bar-Hillel (1952). Their theory attempted to bind a theory of content closely
to first-order predicate logic, and so while their “theory lies explicitly and wholly
within semantics” they explicitly do not address “the information which the sender
intended to convey by transmitting a certain message nor about the information
a receiver obtained with a certain message,” since they believed these notions
could eventually be derived from their formal apparatus (Carnap and Bar-Hillel
1952). Their overly restrictive notion of the content of information as logic did
not gain widespread traction, and neither did other attempts to develop alternative
theories of information such as that of Donald MacKay (1955). In contrast, Dretske's
semantic theory of information defines the notion of content to be compatible with
Shannon's information theory, and his notions have gained some traction within the
philosophical community (Dretske 1981). To him, the content of a message and
the amount of information - the number of bits an encoding would require - are
different, for “saying 'There is a gnu in my backyard' does not have more content
than the utterance 'There is a dog in my backyard' since the former is, statistically,
less probable” (Dretske 1981). According to Shannon, there is more information
in the former case precisely because it is less likely than the latter (Dretske 1981).
So while information that is less frequent may require a larger number of bits in
encoding, the content of information should be viewed as to some extent separable
from, even if compatible with, Shannon's information theory, since otherwise one is led to
the “absurd view that among competent speakers of language, gibberish has more
meaning than semantic discourse because it is much less frequent” (Dretske 1981).
Simply put, Shannon and Dretske are talking about distinct notions that should be
separated, the notions of encoding and content respectively.
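To make Shannon's side of this contrast concrete, the amount of information carried by an outcome x with probability p(x) is its self-information, measured in bits. The probabilities plugged in below are purely illustrative assumptions, not figures given by Dretske:

\[
I(x) = -\log_2 p(x),
\]
\[
I(\text{gnu in my backyard}) = -\log_2(0.001) \approx 9.97 \text{ bits}, \qquad
I(\text{dog in my backyard}) = -\log_2(0.25) = 2 \text{ bits}.
\]

On Shannon's measure the rarer gnu report carries more bits, yet on Dretske's view it does not thereby carry more content.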
Is there a way to precisely define the content of a message? Dretske defines the
content of information as “a signal r carries the information that s is F when the
conditional probability of s's being F, given r (and k), is 1 (but, given k alone, less
than 1). k is the knowledge of the receiver” (1981). To simplify, the content of any
and does not tell Amy (the receiver) which employee in particular won the lottery. Shannon's
theory only measures how many bits are needed to tell Amy precisely who won. After all, the false
message that her office-mate Sandro won a trip to Paris is also 3 bits. Yet content is not independent
of the encoding, for content is conveyed by virtue of a particular encoding and a particular encoding
imposes constraints on what content can be sent (Shannon and Weaver 1963). Let's imagine that
Daniel is using a code of bits specially designed for this problem, rather than natural language, to
tell Amy who won the free plane ticket to Paris. The content of the encoding 001 could be yet
another co-employee Ralph while the content of the encoding 010 could be Sandro. If there are
only two bits available in the encoding and all eight employees each need a unique encoding,
Daniel cannot send a message specifying which friend got the trip, since there aren't enough
options in the encodings to go round. An encoding of at least 3 bits is needed to give each
employee a unique encoding. If 01 has the content that 'either Sandro or Ralph won the ticket',
the message has not been successfully transferred if the purpose of the message is to tell Amy
precisely which employee won the ticket.
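The arithmetic behind this example is just a count of distinct code words, assuming, as the example does, that each of the eight employees needs a code word of their own:

\[
2^2 = 4 < 8 = 2^3, \qquad \log_2 8 = 3 \text{ bits},
\]

so the four two-bit codes (00, 01, 10, 11) cannot single out one of eight employees, whereas the eight three-bit codes (000 through 111) can, which is why at least 3 bits are needed to tell Amy exactly who won.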