2nd corrected printing. — Springer, 1996. — 517 p. — ISBN: 978-1461269038.
The jackknife and bootstrap are the most popular data-resampling methods used in statistical analysis. This book provides a systematic introduction to the theory of the jackknife, the bootstrap, and other resampling methods developed over the preceding twenty years. It aims to serve as a guide that enables applied statisticians to use these methods confidently in their own research. The authors include examples of applying the methods both in the independent and identically distributed (iid) case and in more complicated settings with non-iid data. Readers are assumed to have a reasonable knowledge of mathematical statistics, making the book suitable for graduate students, researchers, and practitioners seeking a wide-ranging survey of this important area of statistical theory and application.
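As a minimal illustration of the two methods the book treats (a sketch, not code from the book), both the delete-1 jackknife and the nonparametric bootstrap estimate a standard error from the data alone:

```python
import random
import statistics

def jackknife_se(data, stat):
    # Delete-1 jackknife: recompute the statistic with each observation
    # left out, then scale the spread of the leave-one-out values.
    n = len(data)
    loo = [stat(data[:i] + data[i + 1:]) for i in range(n)]
    center = sum(loo) / n
    var = (n - 1) / n * sum((v - center) ** 2 for v in loo)
    return var ** 0.5

def bootstrap_se(data, stat, B=2000, seed=0):
    # Nonparametric bootstrap: resample the data with replacement B times
    # and take the standard deviation of the replicated statistic.
    rng = random.Random(seed)
    n = len(data)
    reps = [stat([rng.choice(data) for _ in range(n)]) for _ in range(B)]
    return statistics.stdev(reps)

data = [2.1, 3.4, 1.9, 5.0, 4.2, 3.3, 2.8, 4.7, 3.9, 2.5]
mean = lambda xs: sum(xs) / len(xs)
print(jackknife_se(data, mean))  # for the mean this equals s / sqrt(n) exactly
print(bootstrap_se(data, mean))
```

For the sample mean the two estimates nearly coincide; for less smooth statistics (e.g. the median) their behavior differs, which is a recurring theme of the book.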
Introduction.
Statistics and Their Sampling Distributions, The Traditional Approach, The Jackknife, The Bootstrap, Extensions to Complex Problems, Scope of Our Studies.
Theory for the Jackknife.
Variance Estimation for Functions of Means, Variance Estimation for Functionals, The Delete-d Jackknife, Other Applications: Bias Estimation, Bias Reduction, Miscellaneous Results.
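To illustrate the bias-estimation and bias-reduction applications listed above (an illustrative sketch under the usual delete-1 scheme, not the book's own code): Quenouille's jackknife bias estimate, applied to the divide-by-n variance estimator, recovers the unbiased divide-by-(n-1) version exactly.

```python
def jackknife_bias(data, stat):
    # Quenouille's jackknife bias estimate:
    # (n - 1) * (average of leave-one-out values - full-sample value).
    n = len(data)
    theta = stat(data)
    loo = [stat(data[:i] + data[i + 1:]) for i in range(n)]
    return (n - 1) * (sum(loo) / n - theta)

def var_mle(xs):
    # Biased plug-in variance estimator (divides by n).
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

data = [2.1, 3.4, 1.9, 5.0, 4.2, 3.3, 2.8, 4.7, 3.9, 2.5]
corrected = var_mle(data) - jackknife_bias(data, var_mle)
# For this estimator the correction is exact: `corrected` equals the
# unbiased sample variance (divide by n - 1).
print(corrected)
```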
Theory for the Bootstrap.
Techniques in Proving Consistency, Consistency: Some Major Results, Accuracy and Asymptotic Comparisons, Fixed Sample Performance, Smoothed Bootstrap, Nonregular Cases.
Bootstrap Confidence Sets and Hypothesis Tests.
Bootstrap Confidence Sets, Asymptotic Theory, The Iterative Bootstrap and Other Methods, Empirical Comparisons, Bootstrap Hypothesis Tests.
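The simplest of the confidence-set constructions treated in this chapter, the bootstrap percentile interval, takes the empirical quantiles of the bootstrap replicates as endpoints. A sketch (illustrative, not the book's code):

```python
import random

def percentile_ci(data, stat, alpha=0.05, B=2000, seed=0):
    # Bootstrap percentile interval: the empirical alpha/2 and 1 - alpha/2
    # quantiles of the bootstrap distribution of the statistic.
    rng = random.Random(seed)
    n = len(data)
    reps = sorted(stat([rng.choice(data) for _ in range(n)]) for _ in range(B))
    lo = reps[int(B * alpha / 2)]
    hi = reps[int(B * (1 - alpha / 2)) - 1]
    return lo, hi

data = [2.1, 3.4, 1.9, 5.0, 4.2, 3.3, 2.8, 4.7, 3.9, 2.5]
lo, hi = percentile_ci(data, lambda xs: sum(xs) / len(xs))
print(lo, hi)
```

The book's chapter compares this interval with refinements (BC, BCa, bootstrap-t, iterated bootstrap) that improve its coverage accuracy.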
Computational Methods.
The Delete-1 Jackknife, The Delete-d Jackknife, Analytic Approaches for the Bootstrap, Simulation Approaches for the Bootstrap.
Applications to Sample Surveys.
Sampling Designs and Estimates, Resampling Methods, Comparisons by Simulation, Asymptotic Results, Resampling Under Imputation.
Applications to Linear Models.
Linear Models and Regression Estimates, Variance and Bias Estimation, Inference and Prediction Using the Bootstrap, Model Selection, Asymptotic Theory.
Applications to Nonlinear, Nonparametric, and Multivariate Models.
Nonlinear Regression, Generalized Linear Models, Cox's Regression Models, Kernel Density Estimation, Nonparametric Regression, Multivariate Analysis.
Applications to Time Series and Other Dependent Data.
m-Dependent Data, Markov Chains, Autoregressive Time Series, Other Time Series, Stationary Processes.
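For dependent data of the kind treated in this chapter, resampling individual observations destroys the dependence structure; one standard remedy, the moving-block bootstrap, resamples whole blocks instead. A sketch (not taken from the book):

```python
import random

def moving_block_bootstrap(series, block_len, seed=0):
    # Resample overlapping blocks of length block_len with replacement and
    # concatenate until the original length is reached, preserving
    # short-range dependence within each block.
    rng = random.Random(seed)
    n = len(series)
    starts = list(range(n - block_len + 1))
    out = []
    while len(out) < n:
        s = rng.choice(starts)
        out.extend(series[s:s + block_len])
    return out[:n]

series = [0.5, 0.9, 1.4, 1.1, 0.7, 0.3, 0.6, 1.0, 1.3, 0.8]
print(moving_block_bootstrap(series, block_len=3))
```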
Bayesian Bootstrap and Random Weighting.
Bayesian Bootstrap, Random Weighting, Random Weighting for Functionals and Linear Models, Empirical Results for Random Weighting.
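Rubin's Bayesian bootstrap, the first topic above, replaces resampling with random reweighting: each replicate draws Dirichlet(1, ..., 1) weights for the observations (normalized i.i.d. standard-exponential variables) and evaluates a weighted statistic. A sketch under that scheme:

```python
import random

def bayesian_bootstrap(data, weighted_stat, B=1000, seed=0):
    # Each replicate draws Dirichlet(1, ..., 1) weights by normalizing
    # i.i.d. Exp(1) draws, then evaluates the weighted statistic.
    rng = random.Random(seed)
    reps = []
    for _ in range(B):
        g = [rng.expovariate(1.0) for _ in data]
        total = sum(g)
        w = [gi / total for gi in g]
        reps.append(weighted_stat(data, w))
    return reps

weighted_mean = lambda xs, w: sum(wi * xi for wi, xi in zip(w, xs))
data = [2.1, 3.4, 1.9, 5.0, 4.2, 3.3, 2.8, 4.7, 3.9, 2.5]
reps = bayesian_bootstrap(data, weighted_mean)
print(sum(reps) / len(reps))  # centers near the sample mean
```

Random weighting, the chapter's generalization, allows weight distributions other than Dirichlet.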
Appendix A. Asymptotic Results.
A.1 Modes of Convergence.
A.2 Convergence of Transformations.
A.3 O(·), o(·), and Stochastic O(·), o(·).
A.4 The Borel-Cantelli Lemma.
A.5 The Law of Large Numbers.
A.6 The Law of the Iterated Logarithm.
A.7 Uniform Integrability.
A.8 The Central Limit Theorem.
A.9 The Berry-Esseen Theorem.
A.10 Edgeworth Expansions.
A.11 Cornish-Fisher Expansions.
Appendix B. Notation.