where P(A, B) is the probability of the event A and the event B occurring. Similarly,

P(B|A) = P(A, B) / P(A)    (A.2)
Combining (A.1) and (A.2) we get

P(A|B) = P(B|A) P(A) / P(B)    (A.3)
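As a quick numerical sanity check of (A.3), we can pick some illustrative values (the numbers below are this sketch's own assumptions, not from the text) and verify that Bayes' rule is consistent with (A.2):

```python
# Numerical check of Bayes' rule (A.3) with made-up example values.
p_a = 0.5              # P(A): assumed prior
p_b_given_a = 0.9      # P(B|A): assumed conditional
p_b_given_not_a = 0.2  # P(B|not A): needed to get P(B) by total probability

# Total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# (A.3): P(A|B) = P(B|A) P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b

# Cross-check against (A.2): P(A, B) = P(B|A) P(A) = P(A|B) P(B)
assert abs(p_a_given_b * p_b - p_b_given_a * p_a) < 1e-12
print(p_a_given_b)  # 0.45 / 0.55 ≈ 0.818
```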
If the events A and B do not provide any information about each other, it would be reasonable to assume that P(A|B) = P(A), and therefore from (A.1),

P(A, B) = P(A) P(B)    (A.4)
Whenever (A.4) is satisfied, the events A and B are said to be statistically independent, or simply independent.
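To make the definition concrete, here is a small check of (A.4) on one roll of a fair die; the two events ("even" and "at most 4") are chosen purely for illustration:

```python
from fractions import Fraction

# Illustrative check of (A.4) on one roll of a fair die.
# A = "roll is even", B = "roll is at most 4" (chosen for this example).
omega = range(1, 7)
A = {n for n in omega if n % 2 == 0}  # {2, 4, 6}
B = {n for n in omega if n <= 4}      # {1, 2, 3, 4}

def p(event):
    """Probability of an event under the uniform distribution on omega."""
    return Fraction(len(event), 6)

# Independent iff P(A, B) = P(A) P(B): here 1/3 == (1/2)(2/3)
print(p(A & B) == p(A) * p(B))  # True
```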
Example A.1.1:
A very common channel model used in digital communication is the
binary symmetric channel
.
In this model the input is a random experiment with outcomes 0 and 1. The output of the
channel is another random event with two outcomes 0 and 1. Obviously, the two outcomes are
connected in some way. To see how, let us first define some events:
A: Input is 0
B: Input is 1
C: Output is 0
D: Output is 1
Let's suppose the input is equally likely to be a 1 or a 0, so P(A) = P(B) = 0.5. If the channel is perfect, that is, you get out of the channel what you put in, then we have

P(C|A) = P(D|B) = 1

and

P(C|B) = P(D|A) = 0
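This channel model is easy to simulate. The sketch below (the function name and the eps parameter are this example's own) flips each transmitted bit with crossover probability eps, so eps = 0 reproduces the perfect channel described above:

```python
import random

def bsc(bit, eps, rng):
    """Send one bit through a binary symmetric channel: with
    probability eps the bit is flipped, otherwise it passes through."""
    return bit ^ (rng.random() < eps)

rng = random.Random(0)
sent = [rng.randint(0, 1) for _ in range(1000)]

# Perfect channel (eps = 0): P(C|A) = P(D|B) = 1, so nothing changes.
received = [bsc(b, 0.0, rng) for b in sent]
print(received == sent)  # True

# Noisy channel: roughly a fraction eps of the bits arrive flipped.
errors = sum(bsc(b, 0.1, rng) != b for b in sent)
print(0.05 < errors / len(sent) < 0.15)
```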
With most real channels this perfect behavior is seldom encountered, and generally there is a small probability ε that the transmitted bit will be received in error. In this case, our probabilities are

P(C|A) = P(D|B) = 1 − ε
P(C|B) = P(D|A) = ε

How do we interpret P(C) and P(D)? These are simply the probability that at any given time the output is a 0 or a 1. How would we go about computing these probabilities given the available information? Using (A.1) we can obtain P(A, C) and P(B, C) from P(C|A),