Yue et al. (2002a) investigated the interaction between a linear trend and
a lag-one autoregressive [AR(1)] model using Monte Carlo simulation.
Simulation analysis indicated that the existence of serial correlation alters the
variance of the Mann-Kendall (MK) statistic estimate, and the presence of a
trend alters the magnitude of serial correlation. Furthermore, it was found that
the commonly used pre-whitening procedure for eliminating the effect of
serial correlation on the MK test leads to inaccurate assessments of the
significance of a trend. It was therefore suggested that the trend be removed first, before the magnitude of serial correlation is estimated. Both the
suggested approach and the existing approach were employed to assess the
significance of a trend in the serially correlated annual mean and annual
minimum streamflow data of some pristine river basins in Ontario, Canada. It
was concluded that researchers might have incorrectly identified significant trends when using the existing approach.
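A minimal sketch of that suggested order of operations, assuming Python with numpy and scipy, might look as follows; the function names, the use of Sen's slope for detrending, and the simplified MK statistic (no tie correction) are illustrative assumptions, not Yue et al.'s actual procedure:

```python
import numpy as np
from scipy import stats

def mk_z(x):
    """Mann-Kendall Z statistic (simplified: no tie correction)."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    return (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0

def trend_first_mk(x):
    """Remove the trend before estimating serial correlation, as suggested
    above, then pre-whiten and apply the MK test."""
    t = np.arange(len(x), dtype=float)
    b = stats.theilslopes(x, t)[0]             # Sen's slope of the raw series
    detrended = x - b * t                      # 1. remove the trend first
    r1 = np.corrcoef(detrended[:-1], detrended[1:])[0, 1]  # 2. lag-1 correlation
    residual = detrended[1:] - r1 * detrended[:-1]          # 3. pre-whiten
    blended = residual + b * t[1:]                          # 4. restore the trend
    z = mk_z(blended)
    p = 2.0 * (1.0 - stats.norm.cdf(abs(z)))   # two-sided p-value
    return z, p
```

Running this alongside conventional pre-whitening (steps 2-3 applied to the raw series, without step 1) on the same AR(1)-plus-trend data illustrates how the two approaches can disagree on significance.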
Yue et al. (2002b) studied the efficacy of the two nonparametric rank-
based statistical tests (the Mann-Kendall test and Spearman's rho test) by
Monte Carlo simulation. These two tests were used to assess the significance
of trends in annual maximum streamflow data of 20 pristine basins in Ontario,
Canada. The results indicated that their effectiveness depends on the pre-
assigned significance level, magnitude of trend, sample size, and the amount
of variation within a time series. Thus, the larger the absolute magnitude of the trend or the larger the sample size, the more powerful the tests; conversely, as the amount of variation in a time series increases, the power of the tests decreases.
When a trend is present, the power is also dependent on the distribution type
and skewness of the time series. It was also found that the two tests have virtually the same power in detecting a trend.
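These power properties lend themselves to a short Monte Carlo sketch. The noise model (Gaussian), sample size, slopes, and significance level below are arbitrary illustrative choices, not the settings of the study:

```python
import numpy as np
from scipy import stats

def mk_detects(x, alpha=0.05):
    """Simplified Mann-Kendall test; True if the trend is significant."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return abs(z) > stats.norm.ppf(1 - alpha / 2)

def rho_detects(x, alpha=0.05):
    """Spearman's rho test of the series against its time index."""
    return stats.spearmanr(np.arange(len(x)), x).pvalue < alpha

def power(detects, slope, n=50, sd=1.0, reps=1000, seed=0):
    """Fraction of simulated trend-plus-noise series flagged as significant."""
    rng = np.random.default_rng(seed)
    t = np.arange(n)
    return np.mean([detects(slope * t + rng.normal(0.0, sd, n))
                    for _ in range(reps)])

for slope in (0.005, 0.01, 0.02):              # power grows with trend magnitude
    print(slope, power(mk_detects, slope), power(rho_detects, slope))
```

Raising `n` or `slope` increases both power estimates, raising `sd` lowers them, and the two tests track each other closely, consistent with the findings above.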
Clarke (2002) described a model in which the Gumbel distribution has a
(possibly) time-variant mean. The time-trend in mean value was determined
by a single parameter β estimated by Maximum Likelihood (ML). The large-
sample variance of the ML estimate was compared with the variance of the
trend calculated by linear regression; the latter was found to be 64% greater.
Simulated samples from a standard Gumbel distribution were given superimposed linear trends of different magnitudes, and the efficacy of three trend-testing methods, viz. Maximum Likelihood, Linear Regression, and the nonparametric Mann-Kendall test, was compared. The ML test was found to be more powerful than the Linear Regression and Mann-Kendall tests regardless of the (positive) value of the trend β, and the MK test was the least powerful for all values of β.
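The model Clarke describes can be sketched as a Gumbel distribution whose location drifts linearly in time, mu(t) = mu0 + beta * t. The following is a minimal illustration, assuming scipy; the simulated slope, sample size, and optimizer starting values are hypothetical choices, and the script simply places the ML estimate of β beside the ordinary linear-regression slope:

```python
import numpy as np
from scipy import optimize, stats

def gumbel_trend_nll(params, t, x):
    """Negative log-likelihood of a Gumbel model with a linear
    time trend in the location: mu(t) = mu0 + beta * t."""
    mu0, beta, log_sigma = params
    sigma = np.exp(log_sigma)                  # keeps the scale positive
    z = (x - (mu0 + beta * t)) / sigma
    return np.sum(np.log(sigma) + z + np.exp(-z))

rng = np.random.default_rng(1)
n, true_beta = 60, 0.03                        # illustrative values only
t = np.arange(n, dtype=float)
x = stats.gumbel_r.rvs(loc=true_beta * t, scale=1.0, size=n, random_state=rng)

fit = optimize.minimize(gumbel_trend_nll, x0=[np.mean(x), 0.0, 0.0], args=(t, x))
beta_ml = fit.x[1]
beta_lr = stats.linregress(t, x).slope         # trend by linear regression
print(f"ML estimate of beta: {beta_ml:.4f}, regression slope: {beta_lr:.4f}")
```

Repeating the fit over many simulated series and comparing the sampling variances of the two slope estimates reproduces the kind of efficiency comparison reported above.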
Ducré-Robitaille et al. (2003) evaluated eight homogenization techniques
for detecting discontinuities in temperature series, using simulated datasets that reproduce a wide range of possible situations. The simulated data represented both homogeneous series and series containing one or more step changes.
Although the majority of the techniques considered in this study performed