New Book Announcement
Multilinear Subspace Learning : Dimensionality Reduction of Multidimensional Data
Published: 2015-09-17


[Book Description]

Due to advances in sensor, storage, and networking technologies, data is being generated on a daily basis at an ever-increasing pace in a wide range of applications, including cloud computing, mobile Internet, and medical imaging. This large multidimensional data requires more efficient dimensionality reduction schemes than the traditional techniques. Addressing this need, multilinear subspace learning (MSL) reduces the dimensionality of big data directly from its natural multidimensional representation, a tensor.

Multilinear Subspace Learning: Dimensionality Reduction of Multidimensional Data gives a comprehensive introduction to both theoretical and practical aspects of MSL for the dimensionality reduction of multidimensional data based on tensors. It covers the fundamentals, algorithms, and applications of MSL. Emphasizing essential concepts and system-level perspectives, the authors provide a foundation for solving many of today's most interesting and challenging problems in big multidimensional data processing. They trace the history of MSL, detail recent advances, and explore future developments and emerging applications.

The book follows a unifying MSL framework formulation to systematically derive representative MSL algorithms. It describes various applications of the algorithms, along with their pseudocode. Implementation tips help practitioners in further development, evaluation, and application. The book also provides researchers with useful theoretical information on big multidimensional data in machine learning and pattern recognition. MATLAB® source code, data, and other materials are available at www.comp.hkbu.edu.hk/~haiping/MSL.html
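The core idea the description refers to — mapping a data tensor to a smaller tensor without first flattening it into a vector — is the tensor-to-tensor projection built from mode-n products (covered in Sections 3.1.2 and 3.3.2 below). A minimal NumPy sketch of that mapping; note the projection matrices here are random stand-ins for illustration, whereas an MSL algorithm such as MPCA would learn them from data:

```python
import numpy as np

def mode_n_product(tensor, matrix, mode):
    """Mode-n product: multiply `tensor` by `matrix` along axis `mode`."""
    # Bring the target mode to the front, unfold to a matrix,
    # multiply, then fold back and restore the axis order.
    t = np.moveaxis(tensor, mode, 0)
    unfolded = t.reshape(t.shape[0], -1)        # mode-n unfolding
    projected = matrix @ unfolded               # (new_dim, prod of other dims)
    folded = projected.reshape((matrix.shape[0],) + t.shape[1:])
    return np.moveaxis(folded, 0, mode)

# A toy third-order data tensor, e.g. an 8 x 8 image over 5 frames.
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 8, 5))

# One projection matrix per mode, reducing 8 -> 3, 8 -> 3, 5 -> 2.
U = [rng.standard_normal((3, 8)),
     rng.standard_normal((3, 8)),
     rng.standard_normal((2, 5))]

# Tensor-to-tensor projection: apply each mode-n product in turn.
Y = X
for n, Un in enumerate(U):
    Y = mode_n_product(Y, Un, n)

print(Y.shape)  # (3, 3, 2): the low-dimensional tensor feature
```

The point of the multilinear formulation is visible in the sizes: the three small matrices hold 3×8 + 3×8 + 2×5 = 58 parameters, while a linear map from the vectorized 320-dimensional input to an 18-dimensional feature would need 320×18 = 5760.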


[Table of Contents]

List of Figures  xiii
List of Tables  xvii
List of Algorithms  xix
Acronyms and Symbols  xxi
Preface  xxv
1 Introduction  1 (16)
1.1 Tensor Representation of Multidimensional Data  2 (3)
1.2 Dimensionality Reduction via Subspace Learning  5 (4)
1.3 Multilinear Mapping for Subspace Learning  9 (2)
1.4 Roadmap  11 (3)
1.5 Summary  14 (3)
I Fundamentals and Foundations  17 (88)
2 Linear Subspace Learning for Dimensionality Reduction  19 (30)
2.1 Principal Component Analysis  20 (4)
2.2 Independent Component Analysis  24 (3)
2.3 Linear Discriminant Analysis  27 (4)
2.4 Canonical Correlation Analysis  31 (4)
2.5 Partial Least Squares Analysis  35 (4)
2.6 Unified View of PCA, LDA, CCA, and PLS  39 (1)
2.7 Regularization and Model Selection  40 (3)
2.7.1 Regularizing Covariance Matrix Estimation  40 (1)
2.7.2 Regularizing Model Complexity  41 (1)
2.7.3 Model Selection  42 (1)
2.8 Ensemble Learning  43 (2)
2.8.1 Bagging  43 (1)
2.8.2 Boosting  43 (2)
2.9 Summary  45 (1)
2.10 Further Reading  46 (3)
3 Fundamentals of Multilinear Subspace Learning  49 (22)
3.1 Multilinear Algebra Preliminaries  50 (7)
3.1.1 Notations and Definitions  50 (3)
3.1.2 Basic Operations  53 (3)
3.1.3 Tensor/Matrix Distance Measure  56 (1)
3.2 Tensor Decompositions  57 (2)
3.2.1 CANDECOMP/PARAFAC  57 (1)
3.2.2 Tucker Decomposition and HOSVD  58 (1)
3.3 Multilinear Projections  59 (4)
3.3.1 Vector-to-Vector Projection  59 (2)
3.3.2 Tensor-to-Tensor Projection  61 (1)
3.3.3 Tensor-to-Vector Projection  61 (2)
3.4 Relationships among Multilinear Projections  63 (1)
3.5 Scatter Measures for Tensors and Scalars  64 (4)
3.5.1 Tensor-Based Scatters  64 (3)
3.5.2 Scalar-Based Scatters  67 (1)
3.6 Summary  68 (1)
3.7 Further Reading  69 (2)
4 Overview of Multilinear Subspace Learning  71 (18)
4.1 Multilinear Subspace Learning Framework  72 (2)
4.2 PCA-Based MSL Algorithms  74 (2)
4.2.1 PCA-Based MSL through TTP  74 (2)
4.2.2 PCA-Based MSL through TVP  76 (1)
4.3 LDA-Based MSL Algorithms  76 (2)
4.3.1 LDA-Based MSL through TTP  77 (1)
4.3.2 LDA-Based MSL through TVP  77 (1)
4.4 History and Related Works  78 (3)
4.4.1 History of Tensor Decompositions  78 (1)
4.4.2 Nonnegative Matrix and Tensor Factorizations  79 (1)
4.4.3 Tensor Multiple Factor Analysis and Multilinear Graph-Embedding  80 (1)
4.5 Future Research on MSL  81 (5)
4.5.1 MSL Algorithm Development  81 (3)
4.5.2 MSL Application Exploration  84 (2)
4.6 Summary  86 (1)
4.7 Further Reading  86 (3)
5 Algorithmic and Computational Aspects  89 (16)
5.1 Alternating Partial Projections for MSL  90 (2)
5.2 Initialization  92 (4)
5.2.1 Popular Initialization Methods  92 (1)
5.2.2 Full Projection Truncation  93 (1)
5.2.3 Interpretation of Mode-n Eigenvalues  94 (1)
5.2.4 Analysis of Full Projection Truncation  95 (1)
5.3 Projection Order, Termination, and Convergence  96 (1)
5.4 Synthetic Data for Analysis of MSL Algorithms  97 (2)
5.5 Feature Selection for TTP-Based MSL  99 (2)
5.5.1 Supervised Feature Selection  100 (1)
5.5.2 Unsupervised Feature Selection  101 (1)
5.6 Computational Aspects  101 (2)
5.6.1 Memory Requirements and Storage Needs  101 (1)
5.6.2 Computational Complexity  102 (1)
5.6.3 MATLAB® Implementation Tips for Large Datasets  102 (1)
5.7 Summary  103 (1)
5.8 Further Reading  104 (1)
II Algorithms and Applications  105 (100)
6 Multilinear Principal Component Analysis  107 (34)
6.1 Generalized PCA  108 (5)
6.1.1 GPCA Problem Formulation  108 (1)
6.1.2 GPCA Algorithm Derivation  109 (1)
6.1.3 Discussions on GPCA  110 (2)
6.1.4 Reconstruction Error Minimization  112 (1)
6.2 Multilinear PCA  113 (7)
6.2.1 MPCA Problem Formulation  114 (1)
6.2.2 MPCA Algorithm Derivation  114 (2)
6.2.3 Discussions on MPCA  116 (2)
6.2.4 Subspace Dimension Determination  118 (2)
6.2.4.1 Sequential Mode Truncation  119 (1)
6.2.4.2 Q-Based Method  119 (1)
6.3 Tensor Rank-One Decomposition  120 (4)
6.3.1 TROD Problem Formulation  120 (1)
6.3.2 Greedy Approach for TROD  121 (1)
6.3.3 Solving for the pth EMP  122 (2)
6.4 Uncorrelated Multilinear PCA  124 (7)
6.4.1 UMPCA Problem Formulation  124 (1)
6.4.2 UMPCA Algorithm Derivation  125 (5)
6.4.3 Discussions on UMPCA  130 (1)
6.5 Boosting with MPCA  131 (4)
6.5.1 Benefits of MPCA-Based Booster  132 (1)
6.5.2 LDA-Style Boosting on MPCA Features  132 (2)
6.5.3 Modified LDA Learner  134 (1)
6.6 Other Multilinear PCA Extensions  135 (6)
6.6.1 Two-Dimensional PCA  135 (1)
6.6.2 Generalized Low Rank Approximation of Matrices  136 (1)
6.6.3 Concurrent Subspace Analysis  136 (1)
6.6.4 MPCA plus LDA  137 (1)
6.6.5 Nonnegative MPCA  137 (1)
6.6.6 Robust Versions of MPCA  137 (1)
6.6.7 Incremental Extensions of MPCA  138 (1)
6.6.8 Probabilistic Extensions of MPCA  138 (1)
6.6.9 Weighted MPCA and MPCA for Binary Tensors  139 (2)
7 Multilinear Discriminant Analysis  141 (24)
7.1 Two-Dimensional LDA  142 (3)
7.1.1 2DLDA Problem Formulation  142 (1)
7.1.2 2DLDA Algorithm Derivation  143 (2)
7.2 Discriminant Analysis with Tensor Representation  145 (2)
7.2.1 DATER Problem Formulation  145 (1)
7.2.2 DATER Algorithm Derivation  146 (1)
7.3 General Tensor Discriminant Analysis  147 (3)
7.4 Tensor Rank-One Discriminant Analysis  150 (3)
7.4.1 TR1DA Problem Formulation  150 (1)
7.4.2 Solving for the pth EMP  151 (2)
7.5 Uncorrelated Multilinear Discriminant Analysis  153 (9)
7.5.1 UMLDA Problem Formulation  153 (1)
7.5.2 R-UMLDA Algorithm Derivation  154 (6)
7.5.3 Aggregation of R-UMLDA Learners  160 (2)
7.6 Other Multilinear Extensions of LDA  162 (3)
7.6.1 Graph-Embedding for Dimensionality Reduction  162 (1)
7.6.2 Graph-Embedding Extensions of Multilinear Discriminant Analysis  163 (1)
7.6.3 Incremental and Sparse Multilinear Discriminant Analysis  164 (1)
8 Multilinear ICA, CCA, and PLS  165 (24)
8.1 Overview of Multilinear ICA Algorithms  166 (1)
8.1.1 Multilinear Approaches for ICA on Vector-Valued Data  166 (1)
8.1.2 Multilinear Approaches for ICA on Tensor-Valued Data  166 (1)
8.2 Multilinear Modewise ICA  167 (5)
8.2.1 Multilinear Mixing Model for Tensors  168 (1)
8.2.2 Regularized Estimation of Mixing Tensor  168 (1)
8.2.3 MMICA Algorithm Derivation  169 (1)
8.2.4 Architectures and Discussions on MMICA  170 (1)
8.2.5 Blind Source Separation on Synthetic Data  171 (1)
8.3 Overview of Multilinear CCA Algorithms  172 (1)
8.4 Two-Dimensional CCA  173 (3)
8.4.1 2D-CCA Problem Formulation  173 (1)
8.4.2 2D-CCA Algorithm Derivation  174 (2)
8.5 Multilinear CCA  176 (8)
8.5.1 MCCA Problem Formulation  176 (2)
8.5.2 MCCA Algorithm Derivation  178 (6)
8.5.3 Discussions on MCCA  184 (1)
8.6 Multilinear PLS Algorithms  184 (5)
8.6.1 N-Way PLS  184 (1)
8.6.2 Higher-Order PLS  185 (4)
9 Applications of Multilinear Subspace Learning  189 (16)
9.1 Pattern Recognition System  190 (1)
9.2 Face Recognition  191 (5)
9.2.1 Algorithms and Their Settings  192 (1)
9.2.2 Recognition Results for Supervised Learning Algorithms  193 (1)
9.2.3 Recognition Results for Unsupervised Learning Algorithms  194 (2)
9.3 Gait Recognition  196 (2)
9.4 Visual Content Analysis in Computer Vision  198 (2)
9.4.1 Crowd Event Visualization and Clustering  198 (1)
9.4.2 Target Tracking in Video  199 (1)
9.4.3 Action, Scene, and Object Recognition  199 (1)
9.5 Brain Signal/Image Processing in Neuroscience  200 (2)
9.5.1 EEG Signal Analysis  200 (1)
9.5.2 fMRI Image Analysis  201 (1)
9.6 DNA Sequence Discovery in Bioinformatics  202 (1)
9.7 Music Genre Classification in Audio Signal Processing  202 (1)
9.8 Data Stream Monitoring in Data Mining  203 (1)
9.9 Other MSL Applications  204 (1)
Appendix A Mathematical Background  205 (14)
A.1 Linear Algebra Preliminaries  205 (8)
A.1.1 Transpose  205 (1)
A.1.2 Identity and Inverse Matrices  206 (1)
A.1.3 Linear Independence and Vector Space Basis  206 (1)
A.1.4 Products of Vectors and Matrices  207 (2)
A.1.5 Vector and Matrix Norms  209 (1)
A.1.6 Trace  209 (1)
A.1.7 Determinant  210 (1)
A.1.8 Eigenvalues and Eigenvectors  211 (1)
A.1.9 Generalized Eigenvalues and Eigenvectors  212 (1)
A.1.10 Singular Value Decomposition  212 (1)
A.1.11 Power Method for Eigenvalue Computation  213 (1)
A.2 Basic Probability Theory  213 (2)
A.2.1 One Random Variable  213 (1)
A.2.2 Two Random Variables  214 (1)
A.3 Basic Constrained Optimization  215 (1)
A.4 Basic Matrix Calculus  215 (4)
A.4.1 Basic Derivative Rules  215 (1)
A.4.2 Derivative of Scalar/Vector with Respect to Vector  216 (1)
A.4.3 Derivative of Trace with Respect to Matrix  216 (1)
A.4.4 Derivative of Determinant with Respect to Matrix  217 (2)
Appendix B Data and Preprocessing  219 (8)
B.1 Face Databases and Preprocessing  219 (3)
B.1.1 PIE Database  219 (1)
B.1.2 FERET Database  220 (1)
B.1.3 Preprocessing of Face Images for Recognition  220 (2)
B.2 Gait Database and Preprocessing  222 (5)
B.2.1 USF Gait Challenge Database  222 (2)
B.2.2 Gait Silhouette Extraction  224 (1)
B.2.3 Normalization of Gait Samples  224 (3)
Appendix C Software  227 (4)
C.1 Software for Multilinear Subspace Learning  227 (1)
C.2 Benefits of Open-Source Software  228 (1)
C.3 Software Development Tips  228 (3)
Bibliography  231 (32)
Index  263



Copyright: Xi'an Jiaotong University Library. Design and production: Xi'an Jiaotong University Data and Information Center.
