Wiley, 1995. — 449 p.
The first seven chapters of this book were developed over a period of about 20 years for the course Linear Statistical Models at Michigan State University. They were first distributed in longhand (those former students may still be suffering the consequences), then typed using a word processor some eight or nine years ago. The last chapter, on frequency data, is the result of a summer course, offered every three or four years since 1980.
Linear statistical models are mathematical models that are linear in the unknown parameters and that include a random error term. It is this error term that makes the models statistical. These models lead to the methodology usually called multiple regression or analysis of variance, and have wide applicability to the physical, biological, and social sciences, to agriculture and business, and to engineering.
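To make the form concrete, a typical model of this kind may be written (in generic notation, not necessarily that of the book) as

    Y_i = \beta_0 + \beta_1 x_{i1} + \cdots + \beta_k x_{ik} + \varepsilon_i, \qquad i = 1, \ldots, n,

or, collecting the n observations into a vector, Y = X\beta + \varepsilon with E(\varepsilon) = 0. The unknown parameters \beta_j enter linearly, while the random errors \varepsilon_i supply the statistical component.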
The linearity makes it possible to study these models from a vector space point of view. The vectors Y of observations are represented as arrays written in a form convenient for intuition, rather than necessarily as column or row vectors. The geometry of these vector spaces has been emphasized because the author has found that the intuition it provides is vital to the understanding of the theory. Pictures of the vector spaces have been added for their intuitive value. In the author’s opinion this geometric viewpoint has not been sufficiently exploited in current textbooks, though it is well understood by those doing research in the field. For a brief discussion of the history of these ideas see Herr (1980).
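As a brief sketch of the geometric idea (again in generic notation, and assuming the design matrix X has full column rank), least squares fitting amounts to orthogonally projecting the observation vector Y onto the subspace spanned by the columns of X:

    \hat{Y} = P Y, \qquad P = X (X'X)^{-1} X',

where P is the orthogonal projection onto the column space of X, so the residual vector Y - \hat{Y} is orthogonal to that subspace. This is the kind of picture the vector space viewpoint is meant to convey.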
Bold print is used to denote vectors as well as linear transformations. The author has found it useful for classroom boardwork to use an arrow notation above the symbol to distinguish vectors, and to encourage students to do the same, at least in the earlier part of the course. Students studying these notes should have had a one-year course in probability and statistics at the post-calculus level, plus a course in linear algebra. The author has found that most such students can handle the matrix algebra used here, but need the material on inner products and orthogonal projections introduced in Chapter 1.