New York: Springer, 2022. — 279 p.
This book proposes tools for the analysis of multidimensional and metric data, surveying the state of the art of existing solutions and developing new ones. It focuses mainly on the visual exploration of such data by a human analyst, relying on 2D or 3D scatter-plot displays obtained through dimensionality reduction.
Performing diagnosis of an energy system requires identifying relations between observed monitoring variables and the associated internal state of the system. Dimensionality reduction, which makes it possible to represent a multidimensional dataset visually, is a promising tool to help domain experts analyse these relations. This book reviews existing techniques for visual data exploration and dimensionality reduction, such as tSNE and Isomap, and proposes new solutions to challenges in that field.
In particular, it presents the new unsupervised technique ASKI and the supervised methods ClassNeRV and ClassJSE. It also introduces MING, a new approach to local map quality evaluation. These methods are then applied to the representation of expert-designed fault indicators for smart buildings, I–V curves for photovoltaic systems, and acoustic signals for Li-ion batteries.
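As a minimal illustration of the pipeline the book describes (reduce dimensionality, then explore the resulting 2D scatter plot), the Python sketch below applies scikit-learn's t-SNE to the digits dataset; the dataset choice and all parameter values are illustrative assumptions, not taken from the book.

    # Minimal sketch (not from the book): project multidimensional data
    # to 2D with t-SNE, then display it as a scatter plot for visual
    # exploration. Dataset and parameters are illustrative assumptions.
    import matplotlib.pyplot as plt
    from sklearn.datasets import load_digits
    from sklearn.manifold import TSNE

    X, y = load_digits(return_X_y=True)  # 64-dimensional data points
    X_2d = TSNE(n_components=2, perplexity=30.0,
                init="pca", random_state=0).fit_transform(X)

    # Colour-code points by class to support visual exploration
    plt.scatter(X_2d[:, 0], X_2d[:, 1], c=y, s=5, cmap="tab10")
    plt.title("t-SNE map of the digits dataset")
    plt.show()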
Foreword
Acknowledgements
Funding
Acronyms
Nomenclature
Metric Data
Dimensionality Reduction
Neighbourhood Characterization
Class-Information
List of Figures
List of Tables
Data Science Context
Data in a Metric Space
Measuring Dissimilarities and Similarities
Neighbourhood Ranks
Embedding Space Notations
Multidimensional Data
Sequence Data
Network Data
A Few Multidimensional Datasets
Geometric Datasets
Real Datasets
Automated Tasks
Underlying Distribution
Category Identification
Clusters and Flat Clustering
Outliers and Outlier Detection
Hierarchies and Hierarchical Clustering
Data Manifold Analysis
Latent Variables Extraction and Manifold Learning
Continua and Topology Learning
Model Learning
Classification
Regression
Visual Exploration and Visual Encoding
Human in the Loop Using Graphic Variables
Spatialization and Gestalt Principles
Scatter Plots
2D and Interactive 3D Scatter Plots
Circular Background and Procrustes Transform
Glyphs
Scatter Plot Matrices (SPLOM)
Grand Tour
Parallel Coordinates
Colour Coding
Colour Models
Taxonomy of Colour Maps
Multiple Coordinated Views and Visual Interaction
Graph Drawing
Intermediate Conclusions
Intrinsic Dimensionality
Curse of Dimensionality
Data Sparsity
Norm Concentration
Estimating Intrinsic Dimensionality
Covariance-Based Approaches
Scree Plot
Local Covariance Dimension
Fractal Approaches
Correlation Dimension
Nearest Neighbours Dimension
Towards Local Estimation
Hidalgo
Hill Estimator
Two-Nearest Neighbours Intrinsic Dimensionality Local Estimator (TIDLE)
Gaussian Mixture Modelling
Test of TIDLE on a Two-Cluster Case
TIDLE Perspectives
Map Evaluation
Objective and Practical Indicators
Subjectivity of Indicators
User Studies on Specific Tasks
Unsupervised Global Evaluation
Types of Distortions
Distance Distortions
Rank Distortions
Link Between Distortions and Mapping Continuity
Reasons for the Ubiquity of Distortions
Scalar Indicators
Distance-Based Indicators
Rank-Based Indicators
Aggregation
Indicators Aggregation
Scale Aggregation
Diagrams
Shepard Diagram
Co-Ranking Matrix
Class-Aware Indicators
Class Separation and Aggregation
Class Separation
Classification Accuracy in the Map
Confusion Matrix
Class Aggregation
Comparing Scores Between the Two Spaces
Class Cohesion and Distinction
The Case of One Cluster per Class
Intermediate Conclusions
Map Interpretation
Axes Recovery
Linear Case: Biplots
Non-Linear Case
Local Evaluation
Point-Wise Aggregation
Point Markers
Background Colouring
One-to-Many Relations with Focus Point
Many-to-Many Relations
Matrix View
Graph Display
Map Interpretation Using Neighbourhood Graphs
Uniform Formulation of Rank-Based Indicators
MING Graphs
MING Analysis for a Toy Dataset
Impact of MING Parameters
Digits Interpretation
Impact of the Scale
Impact of the Distortion Measure
Visual Clutter
Interactive Edge Filtering
Edge Bundling
Oil Flow
COIL-20 Dataset
MING Perspectives
Stress Functions for Unsupervised Dimensionality Reduction
Spectral Projections
Principal Component Analysis
Variance Interpretation
Reconstruction Error
Latent Variable Model
Classical MultiDimensional Scaling
Limitations of Linear Methods
Kernel Methods: Isomap, KPCA, LE
Kernel PCA
Isomap
Laplacian Eigenmap
Locally Linear Embedding
Non-Linear MultiDimensional Scaling
Metric MultiDimensional Scaling
Sammon Non-Linear Mapping and Curvilinear Component Analysis
Local MultiDimensional Scaling
Data-Driven High Dimensional Scaling
Non-Metric MultiDimensional Scaling
RankVisu
Neighbourhood Embedding Methods
General Principle: SNE
Scale Setting
Divergence Choice: NeRV and JSE
Symmetrization
Solving the Crowding Problem: tSNE
Kernel Choice
Adaptive Student Kernel Imbedding (ASKI)
Graph Layout
Force Directed Graph Layout: Elastic Embedding
Probabilistic Graph Layout: LargeVis
Topological Method: UMAP
Artificial Neural Networks
Auto-Encoders
IVIS
Intermediate Conclusions
Stress Functions for Supervised Dimensionality Reduction
Types of Supervision
Full Supervision
Weak Supervision
Semi-Supervision
Parametric with Class Purity
Linear Discriminant Analysis
Neighbourhood Component Analysis
Metric Learning
Mahalanobis Distances
Riemannian Metric
Direct Distances Transformation
Additive Transformation
Multiplicative Transformation
Concave vs Convex Transformations
Similarities Learning
Metric Learning Limitations
Class Adaptive Scale
Class-Guiding Principle: Classimap
Class-Guided Neighbourhood Embedding
ClassNeRV Stress
Flexibility of the Supervision
Ablation Study
Comparison with Other Dimensionality Reduction Methods
Isolet Case Study
Robustness to Class Misinformation
Extension to the Type Mixture: ClassJSE
Extension to Semi-Supervision and Weak-Supervision
Extension to Soft Labels
Intermediate Conclusions
Optimization, Acceleration and Out-of-Sample Extensions
Optimization
Global and Local Optima
Gradient Descent and Quasi-Newton Methods
Initialization
Multi-Scale Optimization
Force-Directed Placement Interpretation
Elastic and Plastic Behaviours
Stochastic Gradient Descent
Attractive-Repulsive Decomposition
Blockade Effect
Auxiliary Dimensions
Acceleration Strategies
Attractive Forces Approximation
Binary Search Trees
Repulsive Forces
Forces Sampling
Forces Aggregation
Landmarks Approximation
Out-of-Sample Extension
Applications
Mapping Acceleration
Mapping Interaction
Incremental Positioning: Data Streams
Classification in the Map
Parametric Case: Model-Constrained Mapping
Spectral Projection Methods
Non-parametric Stress with Neural Network Model
Non-parametric Case
Local Linear Transformations: LAMP
Manifold Reconstruction
Radial Basis Functions Interpolation
Intermediate Conclusions
Applications of Dimensionality Reduction to the Diagnosis of Energy Systems
Smart Buildings Commissioning
System and Rules
Mapping
Photovoltaics
I–V Curves
Comparing Normalized I–V Curves
Colour Description of the Chemical Compositions
Batteries
Case 1
Case 2
Conclusions
Some Technical Results
Equivalence Between Triangle Inequality and Convexity of Balls for a Pseudo-Norm
From Pareto to Exponential Distribution
Spiral and Swiss Roll
Kullback–Leibler Divergence
Generalized Kullback–Leibler Divergence
Perplexity with Hard Neighbourhoods
Link Between Soft and Hard Recall and Precision
Details of Calculations
General Gradient of Stress Function
Neighbourhood Embedding
Supervised Neighbourhood Embedding (Asymmetric Case)
Mixtures
Type 1
Type 2
Asymptotic Behaviour
Membership Degrees
Soft-Min Arguments
Gaussian Kernel
Student Kernel
Asymptotic Behaviour
Scale Setting by Perplexity
Force Interpretation
Spectral Projections Algebra
PCA as Matrix Factorization and SVD Resolution
Link with Linear Projection
Sparse Expression
PCA and Centering: From Affine to Linear
Link with Covariance and Gram Matrices
From Distances to Gram Matrix
Probabilistic Interpretation and Maximum Likelihood
Nyström Approximation