Academic Press, 1972. — 195 p.
For the past ten years, my professional interests have focused on various aspects of regression. It has been my experience that the pseudoinverse is a great unifying concept. It has helped me to understand, remember, and explain many classical results in statistical theory as well as to discover (and rediscover) some new ones.
This book was conceived as a hybrid monograph-textbook. As a text, it would be suitable for the equivalent of a two-quarter course. In teaching such a course, one could fill in the remainder of the year with additional material on (for example) multiple regression, nonlinear regression, large sample theory, and optimal experimental design for regression. For this purpose I have attempted to make the development didactic. On the other hand, most of the material comes from reasonably current periodical literature and a fair amount of the material is my own work (some already published, some not). Virtually all of the material deals with regression either directly (Chapters VI-IX) or as background (Chapters I-V). By restricting the domain of discourse we are able to pursue a leisurely pace and, I hope, to preserve a sense of unity throughout.
At the time the manuscript was completed there were, to my knowledge, no textbook treatments of the pseudoinverse. Since that time, two excellent complementary monographs have appeared containing treatments of the Moore-Penrose pseudoinverse in a more general setting. The first (Boullion and Odell) appeared in early 1971 and concerns itself mainly with algebraic and structural properties of these pseudoinverses. The second (Rao and Mitra) appeared later in 1971 and is extremely comprehensive in its coverage of the then extant pseudoinverse literature. Both volumes contain large bibliographies.
The General Theory and Computational Methods
General Background Material
Geometric and Analytic Properties of the Moore-Penrose Pseudoinverse
Pseudoinverses of Partitioned Matrices and Sums and Products of Matrices
Computational Methods
Statistical Applications
The General Linear Hypothesis
Constrained Least Squares, Penalty Functions, and BLUEs
Recursive Computation of Least Squares Estimators
Nonnegative Definite Matrices, Conditional Expectation, and Kalman Filtering