generally improve as more methods are included in the combination. This is shown
in the following example.
Let the forecasts $f_1, f_2, f_3, \ldots, f_k$ of the random variable $z$ be given and let them be linearly combined to give the resulting forecast $f_c$, defined as
$$ f_c = \sum_{i=1}^{k} w_i f_i(z), $$
where $w_i$, $i = 1, 2, \ldots, k$, are the weights assigned to the individual forecasts. The main problem is how to select the individual weights optimally. The simplest way would be to select an equally weighted combination based on the arithmetic average of the individual forecasts. This has proven to be relatively robust and accurate, which is evident when two unbiased forecasts $f_1$ and $f_2$ of a given time series are linearly combined as
$$ f_c = k f_1 + (1 - k) f_2, $$
which will have a minimum mean square error for a suitably chosen $k$. The corresponding forecast error of the combination, $e_c$, is defined using the individual errors $e_1$ and $e_2$ as
$$ e_c = k e_1 + (1 - k) e_2. $$
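For unbiased (zero-mean) and mutually independent errors, so that $E[e_1 e_2] = 0$, a brief sketch of how the minimizing weight is obtained expands the mean square error of the combination and sets its derivative with respect to $k$ to zero:
$$ E[e_c^2] = k^2 E[e_1^2] + 2k(1-k)\,E[e_1 e_2] + (1-k)^2 E[e_2^2] = k^2 E[e_1^2] + (1-k)^2 E[e_2^2], $$
$$ \frac{d\,E[e_c^2]}{dk} = 2k\,E[e_1^2] - 2(1-k)\,E[e_2^2] = 0 \quad\Longrightarrow\quad k = \frac{E[e_2^2]}{E[e_1^2] + E[e_2^2]}. $$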
For the two mutually independent forecast errors, the value
$$ k = \frac{E[e_2^2]}{E[e_1^2] + E[e_2^2]} \approx \frac{e_2^2}{e_1^2 + e_2^2} $$
delivers the minimum value of $E[e_c^2]$, $e_i^2$ being the local estimate of the expected squared error.
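As a simple numerical illustration (a sketch with synthetic data; the series, noise levels, and variable names are assumptions made for the example, not taken from the text), the optimal weight can be estimated from sample errors and the resulting combination compared with the plain arithmetic average:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic target series and two unbiased forecasts whose errors are
# mutually independent (forecast 1 has the larger error variance).
z = np.sin(np.linspace(0, 6 * np.pi, 200))
f1 = z + rng.normal(0.0, 0.3, size=z.shape)
f2 = z + rng.normal(0.0, 0.2, size=z.shape)

e1, e2 = z - f1, z - f2

# Optimal weight on f1: k = E[e2^2] / (E[e1^2] + E[e2^2]),
# estimated here from the sample errors.
k = np.mean(e2**2) / (np.mean(e1**2) + np.mean(e2**2))

f_equal = 0.5 * f1 + 0.5 * f2      # equal-weight (arithmetic average) combination
f_opt = k * f1 + (1.0 - k) * f2    # minimum-MSE linear combination

mse = lambda f: np.mean((z - f) ** 2)
print(f"k = {k:.3f}")
print(f"MSE f1      : {mse(f1):.4f}")
print(f"MSE f2      : {mse(f2):.4f}")
print(f"MSE average : {mse(f_equal):.4f}")
print(f"MSE optimal : {mse(f_opt):.4f}")
```

For independent errors the minimizing weight gives $E[e_c^2] = E[e_1^2]E[e_2^2]/(E[e_1^2]+E[e_2^2])$, which is never larger than either individual error variance; with equal error variances $k = 0.5$ and the combination reduces to the arithmetic average, which helps explain the robustness of equal weighting noted above.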
However, the linear combination of forecasts is not likely to be appropriate in forecasting practice, as the following example shows, in which $k$ different forecasting methods are given, the $i$th individual forecast having the information set $\{I_i : I_c, I_{si}\}$, where $I_c$ is the part of the information common to all $k$ models and $I_{si}$ is the information specific to the $i$th forecast only. Denoting the $i$th forecast by $f_i = F_i(I_i)$, the linear combination of forecasts can be expressed as
$$ F_c = \sum_{i=1}^{k} w_i F_i(I_i), $$
where $w_i$ is the weight of the $i$th forecast. On the other hand, each of the given individual forecasting models can also be regarded as a subsystem for information processing, while the combination method
$$ f_c = F(I_c, I_1, I_2, \ldots, I_k) $$