Multiple Random Variables (GPS)

Navigation systems frequently involve multiple random variables. For example, a vector of simultaneous measurements $\tilde{y}$ might be modeled as

$\tilde{y} = y + v$

where $y$ represents the signal portion of the measurement and $v$ represents a vector of random measurement errors.

In the case of multiple random variables, we are often concerned with how the values of the elements of the random vector relate to each other. When the values are related, one of the random variables may be useful for estimating the value of another. Important questions include how to quantify the interrelation between random variables, how to optimally estimate the value of one random variable when the value of the other is known, and how to quantify the accuracy of that estimate. In this type of analysis, the multivariate density and distribution, and the second-order statistics referred to as correlation and covariance, are important. The discussion of basic properties will focus on two random variables, but the concepts extend directly to higher-dimensional vectors.

Basic Properties

Let v and w be random variables. The joint probability distribution function of v and w is


$F_{v,w}(V, W) = P(v \le V \text{ and } w \le W).$

The joint distribution has the following properties:

1. $F_{v,w}(-\infty, W) = 0$ and $F_{v,w}(V, -\infty) = 0$;
2. $F_{v,w}(\infty, \infty) = 1$;
3. $F_{v,w}(V, \infty) = F_v(V)$ and $F_{v,w}(\infty, W) = F_w(W)$;
4. if $V_1 \le V_2$, then $F_{v,w}(V_1, W) \le F_{v,w}(V_2, W)$;
5. if $W_1 \le W_2$, then $F_{v,w}(V, W_1) \le F_{v,w}(V, W_2)$.

The last two properties state that the joint distribution is nondecreasing in both arguments.

The joint probability density is defined as

$p_{v,w}(V, W) = \dfrac{\partial^2 F_{v,w}(V, W)}{\partial V\,\partial W}.$

The joint density has the following properties:

1. $p_{v,w}(V, W) \ge 0$;
2. $F_{v,w}(V, W) = \int_{-\infty}^{V}\int_{-\infty}^{W} p_{v,w}(\alpha, \beta)\,d\beta\,d\alpha$;
3. $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} p_{v,w}(V, W)\,dV\,dW = 1$;
4. $p_v(V) = \int_{-\infty}^{\infty} p_{v,w}(V, W)\,dW$ and $p_w(W) = \int_{-\infty}^{\infty} p_{v,w}(V, W)\,dV$.

In the above, $p_v(V)$ and $p_w(W)$ are referred to as the marginal densities of v and w, respectively. When the meaning is clear from the context, the subscripts may be dropped on either the distribution or the density.
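These properties are easy to verify numerically. The following sketch (illustrative only, not from the text; it assumes the joint density of two independent standard normals) tabulates a joint density on a grid, checks the normalization property, and recovers the marginals by integrating out one variable:

```python
# Numerical illustration of the joint-density properties (a sketch,
# not from the text): the joint density of two independent standard
# normals is tabulated on a grid, then marginalized.
import numpy as np

V = np.linspace(-5, 5, 501)
W = np.linspace(-5, 5, 501)
dV, dW = V[1] - V[0], W[1] - W[0]
VV, WW = np.meshgrid(V, W, indexing="ij")
p_vw = np.exp(-0.5 * (VV**2 + WW**2)) / (2 * np.pi)

print(p_vw.sum() * dV * dW)      # ~1: total probability is one
p_v = p_vw.sum(axis=1) * dW      # p_v(V) = integral of p_vw over W
p_w = p_vw.sum(axis=0) * dV      # p_w(W) = integral of p_vw over V
print(p_v.sum() * dV)            # ~1: the marginal is a valid density
```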

Example 4.6 Let u and v be random variables with the joint probability density

[joint density expression not recoverable from the source]

Using the properties above, it is straightforward to show that

[marginal densities not recoverable from the source]

Statistics and Statistical Properties

Two random vector variables v and w are independent if

$p_{v,w}(V, W) = p_v(V)\,p_w(W) \quad \text{for all } V, W.$

When two random variables are independent and have the same marginal densities, they are independent and identically distributed (i.i.d.).

Example 4.7 The random variables u and v in Example 4.6 are independent.

There is a result called the central limit theorem [31, 107] that states that if

$y = \sum_{i=1}^{N} x_i,$

where the $x_i$ are independent random variables, then as N increases, the distribution for $y$ approaches a Gaussian distribution, independent of the distributions of the individual $x_i$. This rather remarkable result motivates the importance of Gaussian random variables in applications. Whenever a random effect is the superposition of many small random effects, the superimposed effect can be accurately modeled as a Gaussian random variable.
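A short Monte Carlo experiment (an illustration assumed here, not from the text) makes this concrete: sums of N independent uniform random variables have moments matching theory, and their histograms look increasingly Gaussian as N grows.

```python
# Monte Carlo illustration of the central limit theorem (a sketch,
# not from the text): y = sum of N i.i.d. Uniform(-0.5, 0.5) terms.
import numpy as np

rng = np.random.default_rng(0)
M = 100_000                                  # Monte Carlo trials
for N in (1, 2, 12):
    y = rng.uniform(-0.5, 0.5, size=(M, N)).sum(axis=1)
    # Each term has mean 0 and variance 1/12, so var(y) = N / 12.
    print(N, y.mean(), y.var(), N / 12)
# A histogram of y for N = 12 closely matches the N(0, 1) density.
```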

When two random variables are not independent, it is useful to have metrics to quantify the amount of interdependence. Two important metrics are correlation and covariance.

The correlation matrix between two random variables v and w is defined by

$\mathrm{cor}(v, w) = E\!\left[v\,w^\top\right].$

The covariance matrix for two vector-valued random variables v and w is defined by

$\mathrm{cov}(v, w) = E\!\left[(v - \mu_v)(w - \mu_w)^\top\right],$

where $\mu_v = E[v]$ and $\mu_w = E[w]$.

The correlation coefficient $\rho_{vw}$ is a normalized measure of the correlation between the two scalar random variables v and w, defined as

$\rho_{vw} = \dfrac{\mathrm{cov}(v, w)}{\sigma_v\,\sigma_w}. \qquad (4.23)$

The correlation coefficient always satisfies $|\rho_{vw}| \le 1$. When the magnitude of $\rho_{vw}$ is near one, then knowledge of one of the random variables allows accurate prediction of the other. When $\rho_{vw} = 0$, then v and w are said to be uncorrelated. Uncorrelated random variables have $\mathrm{cov}(v, w) = 0$ and $E[vw] = E[v]\,E[w]$. Independent random variables are uncorrelated, but uncorrelated random variables may or may not be independent. Two vector random variables v and w are uncorrelated if $\mathrm{cov}(v, w) = 0$. Two vector random variables v and w are orthogonal if $E[v\,w^\top] = 0$.
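These definitions translate directly into sample estimates. The sketch below is illustrative, not from the text; the linear model for w is an arbitrary construction chosen so that $\rho_{vw} = 0.8$ by design.

```python
# Sample estimates of covariance and the correlation coefficient
# (a sketch, not from the text).
import numpy as np

rng = np.random.default_rng(1)
M = 200_000
v = rng.normal(size=M)
w = 0.8 * v + 0.6 * rng.normal(size=M)    # var(w) = 0.64 + 0.36 = 1

cov_vw = np.mean((v - v.mean()) * (w - w.mean()))
rho_vw = cov_vw / (v.std() * w.std())     # cov(v, w) / (sigma_v sigma_w)
print(rho_vw)                             # ~0.8, always within [-1, 1]
```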

Example 4.8 Consider the random variables

$u = \cos(\theta) \quad \text{and} \quad v = \sin(\theta),$

where $\theta$ is a uniform random variable on $[0, 2\pi)$. It is left to the reader to show that

$E[u] = 0, \quad E[v] = 0, \quad \text{and} \quad E[uv] = 0.$

These facts show that u and v are orthogonal and uncorrelated. However, it is also straightforward to show that $u^2 + v^2 = 1$. The fact that the variables are algebraically related shows that u and v are not independent.
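A Monte Carlo check of this example, assuming the reconstruction above ($u = \cos\theta$, $v = \sin\theta$, $\theta$ uniform on $[0, 2\pi)$):

```python
# Monte Carlo check of Example 4.8, assuming u = cos(theta) and
# v = sin(theta) with theta uniform on [0, 2*pi).
import numpy as np

rng = np.random.default_rng(2)
theta = rng.uniform(0.0, 2.0 * np.pi, size=500_000)
u, v = np.cos(theta), np.sin(theta)

print(u.mean(), v.mean())    # both ~0
print(np.mean(u * v))        # ~0: E[uv] = 0, orthogonal and uncorrelated
print(np.ptp(u**2 + v**2))   # ~0: u**2 + v**2 = 1 exactly, so dependent
```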

Let the matrix $P = \mathrm{cov}(x, x)$ be the covariance matrix for the vector x. Then, by eqn. (4.23), the element in the i-th row and j-th column of P is

$P_{ij} = \mathrm{cov}(x_i, x_j) = \rho_{ij}\,\sigma_i\,\sigma_j, \qquad (4.24)$

where $\sigma_i$ is the standard deviation of $x_i$ and $\rho_{ij}$ is the correlation coefficient between $x_i$ and $x_j$.

Therefore, knowledge of the covariance matrix for a random vector allows computation of the variance of each component of the vector and of the correlation coefficients between elements of the vector. This fact is very useful in state estimation applications.
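For example, given a covariance matrix P, the per-component standard deviations and correlation coefficients can be recovered as follows (a sketch with a hypothetical 2×2 covariance matrix):

```python
# Recovering standard deviations and correlation coefficients from a
# covariance matrix (a sketch; P is a hypothetical example).
import numpy as np

P = np.array([[4.0, 1.2],
              [1.2, 1.0]])           # p_ii = sigma_i**2
sigma = np.sqrt(np.diag(P))          # standard deviations: [2.0, 1.0]
rho = P / np.outer(sigma, sigma)     # correlation matrix, unit diagonal
print(sigma)
print(rho)                           # rho_12 = 1.2 / (2.0 * 1.0) = 0.6
```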

Example 4.9 An analyst is able to acquire a measurement y that is modeled as

$y = a + b + n.$

The measurement is a function of two unknowns a and b and is corrupted by additive measurement noise $n \sim N(0, \sigma_n^2)$. The analyst has no knowledge of the value of a, but based on prior experience the analyst considers a reasonable model for b to be $b \sim N(0, \sigma_b^2)$. The random variables b and n are assumed to be independent. Both $\sigma_n$ and $\sigma_b$ are positive.

Based on this model and prior experience, the analyst chooses to estimate the values of a and b as

$\hat{a} = y \quad \text{and} \quad \hat{b} = 0.$

The parameter estimation errors are $\tilde{a} = a - \hat{a}$ and $\tilde{b} = b - \hat{b}$. What are the mean parameter estimation errors, the variance of the parameter estimation errors, and the covariance between the parameter estimation errors?

Step 1: $\tilde{a} = a - \hat{a} = a - (a + b + n) = -(b + n)$, so $E[\tilde{a}] = 0$.
Step 2: $\tilde{b} = b - \hat{b} = b$, so $E[\tilde{b}] = 0$.
Step 3: $\mathrm{var}(\tilde{a}) = E[(b + n)^2] = \sigma_b^2 + \sigma_n^2$.
Step 4: $\mathrm{var}(\tilde{b}) = E[b^2] = \sigma_b^2$.
Step 5: $\mathrm{cov}(\tilde{a}, \tilde{b}) = E[-(b + n)\,b] = -\sigma_b^2$.

Based on the results of Steps 3-5 and eqn. (4.23), the correlation coefficient between $\tilde{a}$ and $\tilde{b}$ is

$\rho_{\tilde{a}\tilde{b}} = \dfrac{-\sigma_b^2}{\sigma_b\sqrt{\sigma_b^2 + \sigma_n^2}} = \dfrac{-\sigma_b}{\sqrt{\sigma_b^2 + \sigma_n^2}}.$

If we define

$\tilde{x} = [\tilde{a}, \tilde{b}]^\top,$

then, based on the above analysis,

$P = \mathrm{cov}(\tilde{x}, \tilde{x}) = \begin{bmatrix} \sigma_b^2 + \sigma_n^2 & -\sigma_b^2 \\ -\sigma_b^2 & \sigma_b^2 \end{bmatrix},$

which checks with eqn. (4.24).
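A Monte Carlo check of these results, assuming the additive model $y = a + b + n$ and the estimates $\hat{a} = y$, $\hat{b} = 0$ as reconstructed above (the values a = 7, $\sigma_b = 2$, $\sigma_n = 1$ are arbitrary choices for illustration):

```python
# Monte Carlo check of Example 4.9 under the model assumed above.
import numpy as np

rng = np.random.default_rng(3)
M = 500_000
a, sigma_b, sigma_n = 7.0, 2.0, 1.0    # arbitrary illustrative values
b = rng.normal(0.0, sigma_b, size=M)
n = rng.normal(0.0, sigma_n, size=M)
y = a + b + n

a_tilde = a - y                        # estimation error of a_hat = y
b_tilde = b                            # estimation error of b_hat = 0

print(a_tilde.mean(), b_tilde.mean())  # both ~0
print(a_tilde.var())                   # ~ sigma_b**2 + sigma_n**2 = 5
print(b_tilde.var())                   # ~ sigma_b**2 = 4
print(np.mean(a_tilde * b_tilde))      # ~ -sigma_b**2 = -4
```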

Analysis similar to that of Example 4.9 will have utility in Part II for the initialization of navigation systems. Related to this example, the case where a is a vector and the case where y is a nonlinear function of a are considered in Exercise 4.11.

Vector Gaussian Random Variables

For the vector random variable $x \in \mathbb{R}^n$, the notation $x \sim N(\mu_x, P_x)$ is used to indicate that x has the multivariate Gaussian or Normal density function described by

$p_x(X) = \dfrac{1}{(2\pi)^{n/2}\,|P_x|^{1/2}} \exp\!\left(-\dfrac{1}{2}(X - \mu_x)^\top P_x^{-1} (X - \mu_x)\right).$

Again, the density of the vector Normal random variable is completely described by its expected value $\mu_x$ and its covariance matrix $P_x$.
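As a sketch (not from the text), the density above can be evaluated and sampled with NumPy; the mean vector and covariance matrix below are arbitrary illustrative values:

```python
# Evaluating and sampling the multivariate Normal density (a sketch,
# not from the text; mu and P are arbitrary illustrative values).
import numpy as np

mu = np.array([1.0, -2.0])
P = np.array([[2.0, 0.5],
              [0.5, 1.0]])

def gaussian_pdf(X, mu, P):
    # p_x(X) for x ~ N(mu, P), with X a length-n vector.
    n = len(mu)
    d = X - mu
    norm = (2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(P))
    return np.exp(-0.5 * d @ np.linalg.solve(P, d)) / norm

print(gaussian_pdf(mu, mu, P))   # peak value 1 / ((2 pi)^(n/2) |P|^(1/2))

# Sample statistics approach the parameters mu and P.
rng = np.random.default_rng(4)
x = rng.multivariate_normal(mu, P, size=200_000)
print(x.mean(axis=0))            # ~ mu
print(np.cov(x.T))               # ~ P
```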

Transformations of Vector Random Variables

If v and w are vector random variables in $\mathbb{R}^n$ related by $w = g(v)$, where g is invertible and differentiable with unique inverse $v = g^{-1}(w)$, then the formula of eqn. (4.16) extends to

$p_w(W) = \dfrac{p_v\!\left(g^{-1}(W)\right)}{\left|\det G(W)\right|}, \qquad (4.28)$

where

$G(W) = \left.\dfrac{\partial g}{\partial v}\right|_{v = g^{-1}(W)}$

is the Jacobian matrix of g evaluated at $v = g^{-1}(W)$.
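The formula can be checked numerically. The sketch below is illustrative, not from the text; the elementwise map $w = \exp(v)$ with $v \sim N(0, I)$ is a hypothetical choice of invertible g. It compares the density predicted by eqn. (4.28) with an empirical estimate from samples:

```python
# Numerical check of eqn. (4.28) for a hypothetical invertible map:
# w = g(v) = exp(v) elementwise, so g^{-1}(W) = log(W) and
# |det dg/dv| = exp(v1 + v2) = W1 * W2.
import numpy as np

rng = np.random.default_rng(5)
v = rng.normal(size=(1_000_000, 2))     # v ~ N(0, I), n = 2
w = np.exp(v)                           # w = g(v)

def p_v(V):
    # Joint density of two independent standard normals.
    return np.exp(-0.5 * np.sum(V**2)) / (2 * np.pi)

W0 = np.array([1.0, 1.5])               # test point
V0 = np.log(W0)                         # V0 = g^{-1}(W0)
pw_formula = p_v(V0) / W0.prod()        # p_v(g^{-1}(W)) / |det G(W)|

# Empirical density: fraction of samples in a small box around W0.
h = 0.05
inside = np.all(np.abs(w - W0) < h / 2, axis=1)
print(pw_formula, inside.mean() / h**2) # the two values should agree
```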

Example 4.10 Find the density of the random variable y, where $y = Ax$, $x \sim N(\mu_x, P_x)$, and A is a nonsingular matrix. Examples such as this are important in state estimation and navigation applications. By eqn. (4.28),

$p_y(Y) = \dfrac{p_x\!\left(A^{-1}Y\right)}{\left|\det A\right|},$

which is equivalent to

$p_y(Y) = \dfrac{1}{(2\pi)^{n/2}\,\left|A P_x A^\top\right|^{1/2}} \exp\!\left(-\dfrac{1}{2}(Y - A\mu_x)^\top \left(A P_x A^\top\right)^{-1} (Y - A\mu_x)\right).$

This shows that

$y \sim N\!\left(A\mu_x,\; A P_x A^\top\right).$

This result is considered further in Exercise 4.10.
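A sampling check of this example (illustrative values for $\mu_x$, $P_x$, and A assumed below) confirms that the transformed samples have mean $A\mu_x$ and covariance $A P_x A^\top$:

```python
# Sampling check of Example 4.10: if x ~ N(mu_x, P_x) and y = A x
# with A nonsingular, then y ~ N(A mu_x, A P_x A^T).
import numpy as np

rng = np.random.default_rng(6)
mu_x = np.array([1.0, 2.0])
P_x = np.array([[1.0, 0.3],
                [0.3, 0.5]])
A = np.array([[2.0, 1.0],
              [0.0, 1.0]])          # nonsingular

x = rng.multivariate_normal(mu_x, P_x, size=200_000)
y = x @ A.T                         # apply y = A x to each sample row

print(y.mean(axis=0), A @ mu_x)     # sample mean vs. A mu_x
print(np.cov(y.T))                  # sample covariance ...
print(A @ P_x @ A.T)                # ... vs. A P_x A^T
```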

As demonstrated in Examples 4.4 and 4.10, affine operations on Gaussian random variables yield Gaussian random variables. Nonlinear functions of Gaussian random variables do not, in general, yield Gaussian random variables.
