Effects on the variance of translation and change of scale: for any real $c$,
$$V(X + c) = V(X) \qquad \text{and} \qquad V(cX) = c^2\, V(X).$$
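These two rules can be checked numerically. The sketch below (NumPy assumed; the sample, seed, and constant are arbitrary) uses the population variance (`ddof=0`) so the identities hold for the sample moments as well:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)  # arbitrary sample standing in for X
c = 3.0

v = x.var()  # population-style variance, ddof=0
print(np.isclose((x + c).var(), v))         # V(X + c) = V(X)
print(np.isclose((c * x).var(), c**2 * v))  # V(cX) = c^2 V(X)
```

Both checks print `True`: shifting the sample leaves its spread unchanged, while scaling by $c$ multiplies the variance by $c^2$.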
A.5 Joint Distributions
The joint distribution of $X = (X_1, \ldots, X_n)$, a vector of random variables on $(S, \mathcal{A}, p)$, is the probability distribution $p_X$ on $(\mathbb{R}^n, \mathcal{B}^n)$ (where $\mathcal{B}^n$ denotes the sigma algebra generated by the products of intervals) determined by
$$p_X(I_1 \times \cdots \times I_n) = p(X_1 \in I_1, \ldots, X_n \in I_n),$$
for any set of intervals $(I_1, \ldots, I_n)$.
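The defining identity can be illustrated empirically: for a hypothetical pair of independent Uniform(0, 1) variables, $p(X_1 \in I_1, X_2 \in I_2)$ is just the product of the interval lengths, and the relative frequency of joint membership should approach it. A minimal sketch (intervals and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
# Hypothetical vector (X1, X2) of independent Uniform(0, 1) variables
x1, x2 = rng.random(n), rng.random(n)

# p_X(I1 x I2) = p(X1 in I1, X2 in I2); for this choice it is |I1| * |I2|
i1, i2 = (0.2, 0.5), (0.1, 0.7)
joint = np.mean((x1 > i1[0]) & (x1 < i1[1]) & (x2 > i2[0]) & (x2 < i2[1]))
exact = (i1[1] - i1[0]) * (i2[1] - i2[0])  # 0.3 * 0.6 = 0.18
print(joint, exact)
```

The empirical frequency lands within a fraction of a percent of 0.18 for a sample this large.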
The concepts of joint cumulative distribution function and joint density extend the one-dimensional case in the same way. In the context of joint distributions of vectors $X$, the distribution of each one-dimensional random variable $X_i$ is called a marginal distribution; likewise, its cdf is called a marginal cdf and its density a marginal density.
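In the discrete case, a marginal distribution is recovered by summing the joint distribution over the other variables' values. A small sketch with a hypothetical two-by-two joint probability table:

```python
import numpy as np

# Hypothetical joint probability table p(X1 = i, X2 = j)
joint = np.array([[0.10, 0.20],
                  [0.30, 0.40]])

# Each marginal is obtained by summing the joint over the other index
marginal_x1 = joint.sum(axis=1)  # [0.30, 0.70]
marginal_x2 = joint.sum(axis=0)  # [0.40, 0.60]
print(marginal_x1, marginal_x2)
```

Note that each marginal, like the joint table itself, sums to 1.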
A vector of random variables $X = (X_1, \ldots, X_n)$ has a continuous distribution if and only if there is a positive function $f_X$ such that, for $F_X$ the joint cumulative distribution function of $X$,
$$F_X(x_1, \ldots, x_n) = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_n} f_X(u_1, \ldots, u_n)\, du_1 \cdots du_n.$$
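The relation between joint density and joint cdf can be verified numerically for a density whose cdf is known in closed form. The sketch below (density choice and step count are arbitrary) takes two independent Exponential(1) variables, $f_X(u, v) = e^{-u-v}$ for $u, v \ge 0$, whose joint cdf is $F_X(x_1, x_2) = (1 - e^{-x_1})(1 - e^{-x_2})$, and approximates the double integral with a midpoint Riemann sum:

```python
import numpy as np

def F_numeric(x1, x2, steps=1000):
    # Midpoint Riemann sum for the double integral of f(u, v) = exp(-u - v)
    # over [0, x1] x [0, x2]
    u = (np.arange(steps) + 0.5) * (x1 / steps)
    v = (np.arange(steps) + 0.5) * (x2 / steps)
    f = np.exp(-u)[:, None] * np.exp(-v)[None, :]
    return f.sum() * (x1 / steps) * (x2 / steps)

x1, x2 = 1.0, 2.0
exact = (1 - np.exp(-x1)) * (1 - np.exp(-x2))
print(F_numeric(x1, x2), exact)  # both ≈ 0.5466
```

Integrating the density over the rectangle reproduces the cdf to within the discretization error of the sum.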
The following concepts of correlation and independence between random variables help in understanding joint probability distributions.
The covariance of a pair of random variables $X$ and $Y$ is the expected value of the product of their deviations from their respective means:
$$\mathrm{Cov}(X, Y) = E\big[(X - E(X))\,(Y - E(Y))\big].$$
So, if values above (or below) their expected values tend to occur together, then $X$ and $Y$ have a positive covariance. If values above the expected value for one of them tend to be accompanied by values below the expected value for the other, then they have a negative covariance.
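This sign behavior is easy to see on simulated data. In the sketch below (the construction of the two dependent samples is an arbitrary choice for illustration), one variable moves with $X$ and the other against it:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=50_000)
noise = rng.normal(size=50_000)

y_pos = x + 0.5 * noise    # tends to move with x -> positive covariance
y_neg = -x + 0.5 * noise   # tends to move against x -> negative covariance

# Sample covariance: mean of the product of deviations from the means
cov = lambda a, b: np.mean((a - a.mean()) * (b - b.mean()))
print(cov(x, y_pos) > 0, cov(x, y_neg) < 0)  # True True
```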
Two useful identities relate the covariance to moments and to the variance of a sum:
$$\mathrm{Cov}(X, Y) = E(XY) - E(X)\,E(Y)$$
and
$$V(X + Y) = V(X) + V(Y) + 2\,\mathrm{Cov}(X, Y).$$
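Both identities hold exactly for the sample moments of any finite data set (with population-style `ddof=0` statistics), so they can be checked directly on arbitrary samples:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=1000)
y = rng.normal(size=1000)

# Cov(X, Y) = E(XY) - E(X) E(Y)
cov = np.mean(x * y) - x.mean() * y.mean()
print(np.isclose(cov, np.mean((x - x.mean()) * (y - y.mean()))))

# V(X + Y) = V(X) + V(Y) + 2 Cov(X, Y)
print(np.isclose((x + y).var(), x.var() + y.var() + 2 * cov))
```

Both checks print `True` regardless of the sample used, since these are algebraic identities of the moments, not asymptotic approximations.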