Abstract:
The problem considered is the reduction of computation for the general delete-d jackknife (Wu [8]) in linear models as well as in the i.i.d. case. The delete-d jackknife was proved to have a heteroscedasticity-robustness property for variance estimation in linear models (Shao and Wu [9]) and other desirable asymptotic properties that are not shared by the traditional delete-1 jackknife (Shao and Wu [11]; Wu [12]). Since the delete-d jackknife is based on $\binom{n}{d}$ recomputations of $\hat\theta_{\mathbf{s}}$, where $n$ is the sample size and $\hat\theta_{\mathbf{s}}$ is the analogue of $\hat\theta$ based on the original sample with $d$ observations deleted, the number of computations increases rapidly as $n$ and $d$ increase. Via sampling techniques, a shortcut can be achieved by evaluating $m$ of the $\hat\theta_{\mathbf{s}}$'s drawn at random from all the $\hat\theta_{\mathbf{s}}$'s. The resulting jackknife-sampling hybrid variance estimators (JSVE) are shown to possess all the asymptotic properties that the delete-d jackknife variance estimators have, as long as $m \geq n$. If $m^{-1} = o(n^{-1})$, the increase in mean squared error from using JSVE is relatively negligible, while the number of computations is considerably reduced. The small-sample performance of several JSVE is studied and compared in a simulation study.