Using this formulation to calculate residuals gives a better indication of the performance
of the 3DVAR assimilation algorithm and how best to tune the background and
observation error statistics to improve the analysis. The NCODA 3DVAR system
routinely computes residual vectors while still in observation space and saves the
residual and innovation vectors for each update cycle in a diagnostics file. As noted,
a time history of the innovations and the residuals is the basic information needed to
compute a posteriori refinements to the 3DVAR statistical parameters. Analysis of
the innovations is the most common, and the most accurate, technique for estimating
observation and forecast error covariances and the method has been successfully
applied in practice (e.g. Hollingsworth and Lonnberg 1986). Similarly, a spatial
autocorrelation analysis of the residuals is used to determine if the analysis has
extracted all of the information in the observing system. Any spatial correlation
remaining in the residuals at spatial lags greater than zero represents information
that has not been extracted by the analysis and indicates an inefficient analysis
(Hollingsworth and Lonnberg 1989).
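The residual autocorrelation diagnostic lends itself to a short illustration. The sketch below is not the NCODA diagnostic code; it simply bins pairs of analysis residuals by great-circle separation and averages their normalized cross-products, so that any correlation remaining at lags greater than zero stands out. The function name, argument names, bin width, and maximum lag are all illustrative assumptions.

```python
import numpy as np

def residual_autocorrelation(lat, lon, residual, bin_km=50.0, max_km=1000.0):
    """Binned spatial autocorrelation of analysis residuals."""
    r = np.asarray(residual, dtype=float)
    r = r - r.mean()                      # remove the mean residual
    var = np.mean(r * r)                  # lag-zero variance

    # Great-circle separation (km) between every pair of observations.
    lat_r, lon_r = np.radians(lat), np.radians(lon)
    dlat = lat_r[:, None] - lat_r[None, :]
    dlon = lon_r[:, None] - lon_r[None, :]
    a = (np.sin(dlat / 2) ** 2
         + np.cos(lat_r[:, None]) * np.cos(lat_r[None, :]) * np.sin(dlon / 2) ** 2)
    dist = 2.0 * 6371.0 * np.arcsin(np.sqrt(np.clip(a, 0.0, 1.0)))

    # Average the normalized residual cross-products within each distance bin;
    # values significantly above zero at non-zero lags indicate information
    # the analysis did not extract from the observations.
    iu = np.triu_indices(r.size, k=1)     # each pair once, lag zero excluded
    prod = (r[:, None] * r[None, :])[iu] / var
    edges = np.arange(0.0, max_km + bin_km, bin_km)
    which = np.digitize(dist[iu], edges)
    corr = np.array([prod[which == k].mean() if np.any(which == k) else np.nan
                     for k in range(1, edges.size)])
    return edges[:-1] + bin_km / 2.0, corr   # bin centres (km), correlation
```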
13.5.5 Internal Data Checks
Internal data checks are those quality control procedures performed by the analysis system itself. These data consistency checks are best done within the assimilation algorithm, since they require detailed knowledge of the background and observation error covariances, which are available only when the assimilation is being performed.
performed. The first step is to scale the innovations
.
y H
.
x b //
by the diagonal
/ 1=2 , the symmetric positive-definite covariance matrix of ( 13.1 ).
The elements of this scaled innovation vector ( ) should have a standard deviation
equal to 1 if the background and observation error covariances have been specified
correctly. Assuming this to be the case, set a tolerance limit (T L /
HP b H T
of
.
C R
to detect and reject
any observation that exceeds it. Since P b and R are never perfectly known, it is best
to use a relatively high tolerance limit (T L D 4:0
) to identify marginally acceptable
observations.
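As a concrete illustration of this tolerance-limit step, the short sketch below scales an innovation vector by the square roots of the diagonal elements of (HP^b H^T + R) and flags values beyond T_L = 4.0. It assumes the observation-space matrices HP^b H^T and R are already available as NumPy arrays; the function and variable names are illustrative and do not reproduce the operational NCODA code.

```python
import numpy as np

def tolerance_check(innovation, HPbHT, R, tol=4.0):
    """Flag observations whose diagonally scaled innovation exceeds tol."""
    # Scale each innovation y - H(x_b) by the square root of its expected
    # variance, i.e. the corresponding diagonal element of (HP^bH^T + R).
    # If P^b and R are specified correctly the scaled values have unit
    # standard deviation, so tol = 4.0 acts as a roughly 4-sigma limit.
    scale = np.sqrt(np.diag(HPbHT) + np.diag(R))
    scaled = np.asarray(innovation, dtype=float) / scale

    # Observations beyond the limit are only marked marginally acceptable
    # here; the consistency check described next decides on rejection.
    marginal = np.abs(scaled) > tol
    return scaled, marginal
```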
The second part of the internal data check is a consistency check. It compares the marginally acceptable observations with all of the observations. The procedure is a logical extension of the tolerance limit check described above. In the data consistency test, the innovations are scaled by the full covariance matrix (not just the diagonal). The elements of this scaled innovation vector (d) are also dimensionless quantities that are normally distributed. However, because the scaling in d involves the full covariance matrix, it includes correlations between all of the observations. By comparing the diagonally scaled innovation vector and d, it can be shown (Daley and Barker 2000) which marginally acceptable observations are inconsistent with other observations and can therefore be rejected. The d metric should increase (decrease) relative to the diagonally scaled value when the observation is inconsistent (consistent) with the other observations, as specified by the background and observation error statistics.
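One way to realize this comparison is sketched below. The innovations are whitened with a lower-triangular Cholesky factor of the full covariance matrix (HP^b H^T + R), which is one possible choice of matrix square root, and a flagged observation is rejected only when its fully scaled value grows relative to its diagonally scaled value. This is a sketch under those assumptions, not the Daley and Barker (2000) implementation; the names and the exact form of the rejection rule are illustrative.

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

def consistency_check(innovation, HPbHT, R, scaled_diag, marginal):
    """Compare diagonally and fully scaled innovations for flagged obs."""
    C = HPbHT + R                                     # full covariance matrix
    L = cholesky(C, lower=True)                       # C = L L^T
    d = solve_triangular(L, innovation, lower=True)   # d = L^{-1}(y - H(x_b))

    # A marginally acceptable observation is rejected only when the full
    # scaling makes its value larger than the diagonal scaling did, i.e. it
    # is inconsistent with the surrounding observations as specified by the
    # background and observation error statistics.
    reject = marginal & (np.abs(d) > np.abs(scaled_diag))
    return d, reject
```

In practice the output of tolerance_check would be passed directly to consistency_check, so that only observations flagged as marginally acceptable are candidates for rejection.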