Oxford: Oxford University Press, 2001. — 195 p.
This book provides an introduction to the modern theory of likelihood-based statistical inference. This theory is characterized by several important features. One is the recognition that it is desirable to condition on relevant ancillary statistics. Another is the use of saddlepoint and closely related probability approximations, which are generally very accurate. A third is that, for models with nuisance parameters, inference is often based on marginal or conditional likelihoods, or on approximations to these likelihoods. These methods have often been shown to yield substantial improvements over classical methods. The book also provides an up-to-date account of recent results in this rapidly developing field.
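To give a concrete flavor of likelihood-based inference, the following minimal Python sketch (not taken from the book) computes the signed likelihood ratio statistic, one of the topics listed in the contents below, for a simple exponential-mean model. The model, simulated data, and function names are illustrative assumptions, and the standard normal approximation used for the p-value is only the first-order result that the book's higher-order methods refine.

```python
import numpy as np
from scipy.stats import norm

def signed_lr_statistic(x, theta0):
    """Signed likelihood ratio statistic r for the mean theta of an
    exponential model, evaluated at the null value theta0.

    Log-likelihood: l(theta) = -n*log(theta) - sum(x)/theta,
    maximized at the sample mean theta_hat = mean(x).
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    theta_hat = x.mean()                      # maximum likelihood estimate

    def loglik(theta):
        return -n * np.log(theta) - x.sum() / theta

    w = 2.0 * (loglik(theta_hat) - loglik(theta0))   # likelihood ratio statistic
    return np.sign(theta_hat - theta0) * np.sqrt(w)  # signed square root


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.exponential(scale=2.0, size=25)   # simulated data, true mean 2

    r = signed_lr_statistic(x, theta0=1.5)
    # To first order, r is approximately standard normal under the null,
    # so an approximate two-sided p-value follows directly.
    p_value = 2 * norm.sf(abs(r))
    print(f"r = {r:.3f}, approximate p-value = {p_value:.3f}")
```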
Some Basic Concepts
Large-sample Approximations
Likelihood
First-order Asymptotic Theory
High-order Asymptotic Theory
Asymptotic Theory and Conditional Inference
The Signed Likelihood Ratio Statistic
Likelihood Functions for a Parameter of Interest
The Modified Profile Likelihood Function
Appendix: Data Set Used in the Examples