McGraw-Hill Book Company. 2004. — 253 p. — ISBN: 047168029X.
Statistical decision theory and methods of Bayesian statistical inference have been both intensively and extensively developed during the past twenty years. A unified theory has been constructed during this period, and the concepts and methods have been widely applied to problems in the areas of engineering and communications, economics and management, psychology and behavioral science, and systems and operations research. Because of these developments, interest in decision theory and its applications has greatly increased at all mathematical levels. The purpose of this book is to provide, at an advanced undergraduate or beginning graduate level, a thorough course in the theory and methodology of optimal statistical decisions.
The book is intended for students in the areas of application mentioned above, as well as for students in statistics and mathematics. Throughout the book, expository discussions are presented to ease the reader's path through the technical material. Complete proofs and derivations are given for almost all theoretical results, but these results are usually introduced or followed by explanations and examples.
Contents
Survey of probability theory
  Experiments, sample spaces, and probability
  Random variables, random vectors, and distribution functions
  Some special univariate distributions
  Some special multivariate distributions
Subjective probability and utility
  Subjective probability
  Utility
Statistical decision problems
  Decision problems
  Conjugate prior distributions
  Limiting posterior distributions
  Estimation, testing hypotheses, and linear statistical models
Sequential decisions
  Sequential sampling
  Optimal stopping
  Sequential choice of experiments
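
To give a flavor of the conjugate-prior machinery listed under "Statistical decision problems", here is a minimal sketch of a Beta-Bernoulli posterior update. The function name and numbers are illustrative only, not taken from the book.

```python
# Illustrative sketch: conjugate prior updating for Bernoulli data.
# With a Beta(alpha, beta) prior, observing s successes and f failures
# yields the Beta(alpha + s, beta + f) posterior in closed form.

def beta_bernoulli_update(alpha, beta, observations):
    """Return the posterior Beta parameters after 0/1 observations."""
    successes = sum(observations)
    failures = len(observations) - successes
    return alpha + successes, beta + failures

# Start from the uniform prior Beta(1, 1); observe 7 ones and 3 zeros.
post_a, post_b = beta_bernoulli_update(1, 1, [1] * 7 + [0] * 3)
posterior_mean = post_a / (post_a + post_b)  # 8 / 12
```

Conjugacy is what makes this update a two-line computation: the posterior stays in the same parametric family as the prior, so repeated or sequential sampling only ever adjusts the two counts.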