4.7 Jackknife Estimates of the Regression Coefficients
The jackknife method is a resampling technique that is similar to the bootstrap method. However, instead of drawing random samples with replacement, n subsets with n-1 data points each are generated from a sample of n data points by deleting one data point at a time. The parameters of interest, such as the regression coefficients, are then calculated for each subset, and their mean and dispersion are computed. The disadvantage of this method is the limited number of n subsets: the jackknife estimate of the regression coefficients is therefore less precise than the bootstrap result.
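The loop below operates on the vectors meters and age introduced with the synthetic example in the previous sections. For readers starting here, a comparable data set can be created as in the following hypothetical sketch; the depth range and the noise amplitude are assumptions, whereas the true slope of 5.6 and intercept of 1.2 are those quoted at the end of this section.

rng(0)                            % reset the random number generator for reproducibility
meters = 20 * rand(30,1);         % 30 sampling depths, assumed range of 0 to 20 meters
age = 5.6 * meters + 1.2;         % true regression line with slope 5.6 and intercept 1.2
age = age + 5 * randn(30,1);      % add Gaussian noise with an assumed amplitude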
MATLAB does not provide a jackknife routine. However, the corresponding code is easy to generate:
for i = 1 : 30
% Define two temporary variables j_meters and j_age
j_meters = meters;
j_age = age;
% Eliminate the i-th data point
j_meters(i) = [];
j_age(i) = [];
% Compute regression line from the n-1 data points
p(i,:) = polyfit(j_meters,j_age,1);
end
The jackknife estimates from the n-1 = 29 data points can be obtained with this simple for loop. Within each iteration, the i-th data point is deleted and the regression coefficients are calculated from the remaining points. The mean of the n = 30 sets of coefficients gives an improved estimate of the coefficients, and their dispersion can be quantified as sketched at the end of this section. Similar to the bootstrap result, the slope of the regression line (first coefficient) is clearly defined, whereas the intercept with the y-axis (second coefficient) has a large uncertainty,
mean(p(:,1))
ans =
5.6382
compared to 5.6023+/-0.4421 and
mean(p(:,2))
ans =
1.0100
compared to 1.3366+/-4.4079 as calculated by the bootstrap method. The
true values are 5.6 and 1.2, respectively. The histograms of the jackknife results from the 30 subsamples can be displayed with
hist(p(:,1));
figure
hist(p(:,2));
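The dispersion of the jackknife coefficients, mentioned at the beginning of this section, is not computed in the example above. As a minimal sketch, it can be estimated with the standard jackknife formula for the standard error, se = sqrt((n-1)/n * sum((theta_i - mean(theta)).^2)); the variable names n, p_mean and se_jack below are introduced here for illustration.

n = size(p,1);                             % number of jackknife subsets (30)
p_mean = mean(p);                          % jackknife mean of slope and intercept
se_jack = sqrt((n-1)/n * ...
    sum((p - repmat(p_mean,n,1)).^2));     % jackknife standard error of both coefficients

These values can then be compared with the +/- ranges quoted above for the bootstrap coefficients.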