Springer, 2017. — 256 p. — ISBN-10: 9811067031; ISBN-13: 978-9811067037.
This book reports on the latest advances in the concepts and development of principal component analysis (PCA), addressing in detail a number of open problems related to dimensionality reduction techniques and their extensions. Bringing together research results previously scattered across many scientific journal papers worldwide, the book presents them in a methodologically unified form. Offering vital insights into the subject matter in self-contained chapters that balance theory and concrete applications, and focusing especially on open problems, it is essential reading for all researchers and practitioners with an interest in PCA.
Zhenfang Hu, Gang Pan, Yueming Wang and Zhaohui Wu
Sparse Principal Component Analysis via Rotation and Truncation
Aloke Datta, Susmita Ghosh and Ashish Ghosh
PCA, Kernel PCA and Dimensionality Reduction in Hyperspectral Images
Marco Geraci and Alessio Farcomeni
Principal Component Analysis in the Presence of Missing Data
Jiyong Oh and Nojun Kwak
Robust PCAs and PCA Using Generalized Mean
Salaheddin Alakkari and John Dingliana
Principal Component Analysis Techniques for Visualization of Volumetric Data
Panos P. Markopoulos, Sandipan Kundu, Shubham Chamadia, Nicholas Tsagkarakis and Dimitris A. Pados
Outlier-Resistant Data Processing with L1-Norm Principal Component Analysis
Francesc Pozo and Yolanda Vidal
Damage and Fault Detection of Structures Using Principal Component Analysis and Hypothesis Testing
Meng Lu, Kai He, Jianhua Z. Huang and Xiaoning Qian
Principal Component Analysis for Exponential Family Data
Yannick Deville, Charlotte Revel, Véronique Achard and Xavier Briottet
Application and Extension of PCA Concepts to Blind Unmixing of Hyperspectral Data with Intra-class Variability