Shannon and Weaver, we can calculate the difference made by 'knowing' H2 is the
case from the difference in entropy of the two situations:
Entropy of H1 = −(0.5 Log₂(0.5) + 0.5 Log₂(0.5)) = −((−0.5) + (−0.5)) = 1 bit

Entropy of H2 = −(1.0 Log₂(1.0) + 0.0 Log₂(0.0)) = −((−0.0) + (−0.0)) = 0 bits

(taking 0.0 Log₂(0.0) to be 0, by the usual convention).
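The two entropy values can be checked with a short Python sketch (the `entropy` helper and variable names are illustrative, not from the text); it follows the same convention that a zero-probability outcome contributes nothing to the sum:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    # Terms with p = 0 are skipped, following the convention 0 * log2(0) = 0.
    total = sum(p * math.log2(p) for p in probs if p > 0)
    return -total if total else 0.0

h1 = entropy([0.5, 0.5])  # H1: fair coin, maximal uncertainty
h2 = entropy([1.0, 0.0])  # H2: two-headed coin, no uncertainty
print(h1)       # → 1.0
print(h2)       # → 0.0
print(h1 - h2)  # → 1.0 (the difference made by knowing H2 is the case)
```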
So the difference made by 'knowing H2 rather than H1 is the case' is (1 − 0) = 1 bit.
The effect of the new information is mediated by an agent's current beliefs about the world. Suppose the agent's initial confidence in each of these hypotheses is:
Agent:

  Eₙ₋₁(H1)   0.8
  Eₙ₋₁(H2)   0.2
  Total      1.0
Then we can calculate the effect of an experiment (tossing the coin) as follows.
Using:
Eₙ(H/Re) = Eₙ₋₁(Re/H) / Eₙ₋₁(Re), where Eₙ₋₁(Re/H) = Eₙ₋₁(H) × P(Re/H)
Agent Eₙ₋₁(Re/H):

  Eₙ₋₁(H) × P(Re/H)          E(Head/H)          E(Tail/H)          Total
  Eₙ₋₁(H1) × P(Result/H1)    0.8 × 0.5 = 0.4    0.8 × 0.5 = 0.4    0.8
  Eₙ₋₁(H2) × P(Result/H2)    0.2 × 1.0 = 0.2    0.2 × 0.0 = 0.0    0.2
  E(Re)                      0.6                0.4                1.0
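The numbers in this table can be reproduced with a small Python sketch (the dictionary names are illustrative, not from the text): each cell multiplies the prior confidence by the likelihood of the result under that hypothesis, and the bottom row E(Re) sums each column.

```python
prior = {"H1": 0.8, "H2": 0.2}           # E_n-1(H): the agent's confidences
likelihood = {                           # P(Re/H): result given hypothesis
    "H1": {"Head": 0.5, "Tail": 0.5},    # fair coin
    "H2": {"Head": 1.0, "Tail": 0.0},    # two-headed coin
}

results = ("Head", "Tail")
# E_n-1(Re/H) = E_n-1(H) * P(Re/H), one cell per hypothesis/result pair
joint = {h: {r: prior[h] * likelihood[h][r] for r in results} for h in prior}
# E(Re): the column totals
evidence = {r: sum(joint[h][r] for h in prior) for r in results}

print({r: round(v, 2) for r, v in evidence.items()})  # → {'Head': 0.6, 'Tail': 0.4}
```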
We can then calculate:
Agent Eₙ(H/Re):

                Head occurs        Tail occurs
  Eₙ(H1/Re)     0.4/0.6 = 0.67     0.4/0.4 = 1.0
  Eₙ(H2/Re)     0.2/0.6 = 0.33     0.0/0.4 = 0.0
  Total         1.0                1.0
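Dividing each joint value by the matching E(Re) gives these updated confidences. The whole update can be sketched as one Python function (names are illustrative, not from the text):

```python
prior = {"H1": 0.8, "H2": 0.2}
likelihood = {"H1": {"Head": 0.5, "Tail": 0.5},
              "H2": {"Head": 1.0, "Tail": 0.0}}

def update(result):
    # E_n(H/Re) = E_n-1(H) * P(Re/H) / E(Re)
    joint = {h: prior[h] * likelihood[h][result] for h in prior}
    e_re = sum(joint.values())            # E(Re) for this result
    return {h: joint[h] / e_re for h in prior}

print({h: round(p, 2) for h, p in update("Head").items()})  # → {'H1': 0.67, 'H2': 0.33}
print(update("Tail"))                                       # → {'H1': 1.0, 'H2': 0.0}
```

Note that a Head leaves the agent genuinely uncertain (0.67 vs 0.33), while a single Tail rules H2 out entirely, because H2 assigns Tail zero probability.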