2 Variational Methods
Successful and efficient minimization of the cost function ((19.1) from the main text) requires the so-called Hessian preconditioning (e.g., Axelsson and Barker 1984; Zupanski 1993, 1995). The square-root forecast error covariance is commonly used for this purpose in variational methods, introduced as a change of variable

$$\mathbf{x} - \mathbf{x}_f = \mathbf{P}_f^{1/2}\,\mathbf{w} \qquad (19.26)$$

where $\mathbf{w}$ is the preconditioned control variable.
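As an illustration of the change of variable (19.26), the following is a minimal numerical sketch using a small synthetic covariance; the matrix `P_f`, its size, and the symmetric square root via eigendecomposition are illustrative assumptions, not part of the original formulation.

```python
# Minimal sketch of the change of variable x - x_f = P_f^{1/2} w (19.26),
# using a synthetic forecast error covariance P_f (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Symmetric positive-definite stand-in for the forecast error covariance.
A = rng.standard_normal((n, n))
P_f = A @ A.T + n * np.eye(n)

# Symmetric square root: P_f^{1/2} = V diag(sqrt(lam)) V^T.
lam, V = np.linalg.eigh(P_f)
P_f_sqrt = V @ np.diag(np.sqrt(lam)) @ V.T

x_f = rng.standard_normal(n)   # forecast state
w = rng.standard_normal(n)     # preconditioned control variable
x = x_f + P_f_sqrt @ w         # physical state implied by w

# Inverting the change of variable recovers w from the increment x - x_f.
w_back = np.linalg.solve(P_f_sqrt, x - x_f)
assert np.allclose(w, w_back)
```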
The iterative minimization solution $\mathbf{w}_a$ is obtained as the limit of the sequence

$$\left\{\, \mathbf{w}_k = \mathbf{w}_{k-1} + \alpha_{k-1}\,\mathbf{d}_{k-1} \;;\; k = 1, 2, \ldots \,\right\}$$

where $k$ is the iteration index, $\alpha$ is the step size, and $\mathbf{d}$ is the descent direction. After substituting the minimization solution $\mathbf{w}_a$ in (19.26) one obtains the analysis solution in terms of the physical state variable

$$\mathbf{x}_a - \mathbf{x}_f = \mathbf{P}_f^{1/2}\,\mathbf{w}_a. \qquad (19.27)$$
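A minimal sketch of the descent iteration above, applied to a generic quadratic cost $J(\mathbf{w}) = \tfrac{1}{2}\mathbf{w}^T \mathbf{H} \mathbf{w} - \mathbf{b}^T \mathbf{w}$; the Hessian `H`, vector `b`, steepest-descent direction, and exact line-search step size are illustrative assumptions, not the chapter's cost function (19.1).

```python
# Sketch of the iteration w_k = w_{k-1} + alpha_{k-1} d_{k-1} for a
# synthetic quadratic cost J(w) = 0.5 w^T H w - b^T w (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n))
H = A @ A.T + n * np.eye(n)    # stand-in for the preconditioned Hessian
b = rng.standard_normal(n)

w = np.zeros(n)                # w_0
for k in range(1, 200):
    g = H @ w - b              # gradient of J at w_{k-1}
    if np.linalg.norm(g) < 1e-10:
        break                  # sequence has (numerically) converged
    d = -g                     # steepest-descent direction d_{k-1}
    alpha = (g @ g) / (d @ (H @ d))   # exact line search for a quadratic
    w = w + alpha * d          # w_k = w_{k-1} + alpha_{k-1} d_{k-1}

w_a = w                        # minimization solution (limit of the sequence)
assert np.allclose(H @ w_a, b, atol=1e-8)
```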
After substituting (19.21) in (19.27), and denoting

$$\beta_i = \sigma_i\,\mathbf{v}_i^{T}\,\mathbf{w}_a \qquad (19.28)$$

the variational method solution (19.27) becomes

$$\mathbf{x}_a - \mathbf{x}_f = \sum_i \beta_i\,\mathbf{u}_i. \qquad (19.29)$$
Therefore, the variational solution can also be represented as a linear combination
of forecast error covariance singular vectors.
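A short numerical check of (19.28)-(19.29), assuming (19.21) is the singular value decomposition $\mathbf{P}_f^{1/2} = \sum_i \sigma_i \mathbf{u}_i \mathbf{v}_i^T$ (consistent with its use here); the covariance and $\mathbf{w}_a$ are synthetic stand-ins.

```python
# Verify that the increment x_a - x_f = P_f^{1/2} w_a equals the linear
# combination sum_i beta_i u_i with beta_i = sigma_i v_i^T w_a (19.28)-(19.29).
import numpy as np

rng = np.random.default_rng(2)
n = 5
A = rng.standard_normal((n, n))
P_f = A @ A.T + n * np.eye(n)          # synthetic forecast error covariance
lam, V = np.linalg.eigh(P_f)
P_f_sqrt = V @ np.diag(np.sqrt(lam)) @ V.T

U, sigma, Vt = np.linalg.svd(P_f_sqrt) # P_f^{1/2} = U diag(sigma) V^T
w_a = rng.standard_normal(n)           # stand-in minimization solution

beta = sigma * (Vt @ w_a)              # beta_i = sigma_i v_i^T w_a   (19.28)
increment = U @ beta                   # sum_i beta_i u_i             (19.29)

assert np.allclose(increment, P_f_sqrt @ w_a)
```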
Since the majority of currently used data assimilation algorithms are based on the KF and/or variational methods, one can see from (19.25) to (19.29) that the analysis correction $\mathbf{x}_a - \mathbf{x}_f$ lies in the space spanned by the forecast error covariance singular vectors.
Appendix 2
Entropy and Mutual Information
We follow Cover and Thomas (2006) to quantify the information content of observations, based on Shannon information theory (Shannon and Weaver 1949) and relative entropy (Kullback and Leibler 1951). The entropy of a random variable $X$ is defined as a non-negative measure of uncertainty.
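In the discrete case this is the standard Shannon definition, with $p(x)$ denoting the probability mass function of $X$ (the notation follows Cover and Thomas 2006):

$$H(X) = -\sum_{x} p(x)\,\log p(x) \;\geq\; 0.$$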