If (1.71) and (1.72) are satisfied, we say that the events are statistically independent
(i.e., the occurrence of one event does not affect the occurrence of the other event).
From (1.67), we write:

P{A|B} = P{AB} / P{B} = P{A},   (1.73)

P{B|A} = P{AB} / P{A} = P{B}.   (1.74)
From (1.73) and (1.74), it follows that the joint probability of two statistically
independent events is equal to the product of the two corresponding probabilities:

P{AB} = P{A} P{B}.   (1.75)
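As a quick numerical sanity check of (1.75), the following Python sketch (not part of the text; the event probabilities 0.5 and 1/6 are arbitrary illustrative choices) estimates P{A}, P{B}, and P{AB} for two independently generated events and compares P{AB} with the product P{A}P{B}.

```python
import random

# Monte Carlo sketch (illustrative only): A and B are generated independently
# with hypothetical probabilities P{A} = 0.5 and P{B} = 1/6.
random.seed(0)
trials = 200_000
count_a = count_b = count_ab = 0

for _ in range(trials):
    a = random.random() < 0.5      # event A occurs
    b = random.random() < 1 / 6    # event B occurs
    count_a += a
    count_b += b
    count_ab += a and b            # joint event AB occurs

p_a = count_a / trials
p_b = count_b / trials
p_ab = count_ab / trials

# For independent events, (1.75) predicts P{AB} = P{A}P{B}.
print(f"P{{A}}P{{B}} = {p_a * p_b:.4f}, estimated P{{AB}} = {p_ab:.4f}")
```

With a large number of trials the two printed values agree to within sampling error, as (1.75) predicts for independent events.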
This result can be generalized to N mutually independent events A_j, j = 1, ..., N.
If they are independent, then any one of them is independent of any event formed by
unions, intersections, or complements of the others [PAP65, p. 42].
As a consequence, it is necessary that, for all combinations of indices
1 ≤ i < j < k < ... ≤ N,

P{A_i A_j} = P{A_i} P{A_j},

P{A_i A_j A_k} = P{A_i} P{A_j} P{A_k},

...

P{A_1 ... A_N} = ∏_{i=1}^{N} P{A_i}.   (1.76)

There are 2^N − N − 1 of these conditions [PEE93, p. 21].
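To illustrate the count 2^N − N − 1, the short Python sketch below (the value N = 4 is an arbitrary choice, not from the text) enumerates all combinations of two or more events out of N and confirms that their number matches the formula.

```python
from itertools import combinations

# Minimal sketch: each product condition in (1.76) corresponds to one subset
# of two or more events, so the number of conditions is 2**N - N - 1.
N = 4
conditions = [subset
              for size in range(2, N + 1)
              for subset in combinations(range(1, N + 1), size)]

print(f"{len(conditions)} conditions for N = {N}")  # prints 11 for N = 4
assert len(conditions) == 2**N - N - 1
```

For N = 4 this gives 6 pairs, 4 triples, and 1 quadruple, i.e., 11 = 2^4 − 4 − 1 conditions.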
Example 1.7.1 The system shown in Fig. 1.11 has two units in serial connection,
such that both units must operate in order for the system to operate. The
probabilities of failure-free operation of the units during time t are p_1 and p_2,
and the failure of one unit does not depend on the failure of the other unit.
Find the probability that only the first unit failed, given that the system has failed.
Solution The space S is subdivided into N = 4 mutually exclusive events B_j,
j = 1, ..., 4:

B_1 = {Both units operate};   (1.77)
Fig. 1.11 Serial connection of units
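The book's solution continues beyond this excerpt. As a hedged sketch of the reasoning it sets up (the values p1 = 0.9 and p2 = 0.8 are arbitrary illustrative choices, and the series-system bookkeeping below is an assumption about how the remaining events B_2, ..., B_4 are defined, not a reproduction of the text's derivation), the requested conditional probability can be computed as follows.

```python
# Illustrative sketch for Example 1.7.1 (values are hypothetical, not from the
# text): two independent units in series, with probabilities of failure-free
# operation p1 and p2 over time t.
p1, p2 = 0.9, 0.8            # assumed reliabilities of units 1 and 2

p_b1 = p1 * p2               # B1: both units operate
p_b2 = (1 - p1) * p2         # B2: only the first unit failed
p_b3 = p1 * (1 - p2)         # B3: only the second unit failed
p_b4 = (1 - p1) * (1 - p2)   # B4: both units failed

p_system_failed = p_b2 + p_b3 + p_b4      # equals 1 - p1 * p2 for a series system
p_only_first_given_failed = p_b2 / p_system_failed

print(f"P{{only unit 1 failed | system failed}} = {p_only_first_given_failed:.4f}")
```

Under these assumed numbers the result is (1 − p1) p2 / (1 − p1 p2) = 0.08 / 0.28 ≈ 0.286.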