
Tsybakov A.B. Introduction to Nonparametric Estimation

Springer, 2009. — 221 p.
The tradition of considering the problem of statistical estimation as that of estimation of a finite number of parameters goes back to Fisher. However, parametric models provide only an approximation, often imprecise, of the underlying statistical structure. Statistical models that explain the data in a more consistent way are often more complex: Unknown elements in these models are, in general, some functions having certain properties of smoothness. The problem of nonparametric estimation consists in estimation, from the observations, of an unknown function belonging to a sufficiently large class of functions.
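A concrete instance of estimating an unknown function from observations is the kernel density estimator treated in the book's first chapter. The sketch below (not taken from the book; names and the sample are illustrative) implements the standard Gaussian-kernel estimator p_n(x) = (1/nh) Σ K((x − X_i)/h):

```python
import math

def gaussian_kernel(u):
    """Standard Gaussian kernel K(u), which integrates to 1."""
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def kde(x, sample, h):
    """Kernel density estimate at point x with bandwidth h > 0."""
    n = len(sample)
    return sum(gaussian_kernel((x - xi) / h) for xi in sample) / (n * h)

# Illustrative sample; in practice X_1, ..., X_n are i.i.d. draws
# from the unknown density being estimated.
sample = [0.1, -0.3, 0.5, 0.2, -0.1, 0.4, 0.0, -0.2]
print(kde(0.0, sample, h=0.3))
```

The bandwidth h governs the bias–variance trade-off that drives the rates of convergence studied later: small h gives a wiggly, high-variance estimate, large h an oversmoothed, biased one.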
The theory of nonparametric estimation has developed considerably over the last two decades, focusing on the following fundamental topics:
(1) methods of construction of the estimators;
(2) statistical properties of the estimators (convergence, rates of convergence);
(3) study of optimality of the estimators;
(4) adaptive estimation.
Basic topics (1) and (2) will be discussed in Chapter 1, though we mainly focus on topics (3) and (4), which are placed at the core of this book. We will first construct estimators having optimal rates of convergence in a minimax sense for different classes of functions and different distances defining the risk. Next, we will study optimal estimators in the exact minimax sense presenting, in particular, a proof of Pinsker’s theorem. Finally, we will analyze the problem of adaptive estimation in the Gaussian sequence model. A link between Stein’s phenomenon and adaptivity will be discussed.
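Stein's phenomenon mentioned above is the fact that for estimating a mean vector in dimension d ≥ 3, the maximum-likelihood estimator Y itself is dominated by a shrinkage estimator. A minimal simulation sketch (illustrative, not the book's code) using the positive-part James–Stein estimator:

```python
import random

def james_stein(y):
    """Positive-part James-Stein estimate for Y ~ N(theta, I_d), d >= 3:
    shrinks Y toward 0 by the data-driven factor (1 - (d-2)/||Y||^2)_+."""
    d = len(y)
    norm_sq = sum(v * v for v in y)
    shrink = max(0.0, 1.0 - (d - 2) / norm_sq)
    return [shrink * v for v in y]

random.seed(0)
d, reps = 10, 2000
theta = [0.5] * d          # illustrative true mean vector
mse_mle = mse_js = 0.0
for _ in range(reps):
    y = [t + random.gauss(0.0, 1.0) for t in theta]
    js = james_stein(y)
    mse_mle += sum((a - t) ** 2 for a, t in zip(y, theta)) / reps
    mse_js += sum((a - t) ** 2 for a, t in zip(js, theta)) / reps
print(mse_mle, mse_js)
```

Averaged over many replications, the MLE's risk is close to d while the James–Stein risk is strictly smaller; the book develops the link between this shrinkage idea and adaptive estimation in the Gaussian sequence model.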
This book is an introduction to the theory of nonparametric estimation. It does not aim to give an encyclopedic coverage of the existing theory or an initiation to applications. Rather, it treats some simple models and examples in order to present the basic ideas and tools of nonparametric estimation. We prove, in a detailed and relatively elementary way, a number of classical results that are well known to experts but whose original proofs are sometimes neither explicit nor easily accessible. We consider only models with independent observations; the case of dependent data adds nothing conceptually but introduces some technical difficulties.
Nonparametric estimators
Examples of nonparametric models and problems
Kernel density estimators
Fourier analysis of kernel density estimators
Unbiased risk estimation. Cross-validation density estimators
Nonparametric regression. The Nadaraya–Watson estimator
Local polynomial estimators
Projection estimators
Generalizations
Oracles
Unbiased risk estimation for regression
Three Gaussian models
Notes
Exercises
Lower bounds on the minimax risk
A general reduction scheme
Lower bounds based on two hypotheses
Distances between probability measures
Lower bounds on the risk of regression estimators at a point
Lower bounds based on many hypotheses
Lower bounds in L2
Lower bounds in the sup-norm
Other tools for minimax lower bounds
Notes
Exercises
Asymptotic efficiency and adaptation
Pinsker’s theorem
Linear minimax lemma
Proof of Pinsker’s theorem
Upper bound on the risk
Lower bound on the minimax risk
Stein’s phenomenon
Unbiased estimation of the risk
Oracle inequalities
Minimax adaptivity
Inadmissibility of the Pinsker estimator
Notes
Exercises
Appendix