Dimensionality Reduction

From Ioannis Kourouklides

This page contains resources about Dimensionality Reduction, Model Order Reduction, Blind Signal Separation / Source Separation, Subspace Learning and Continuous Latent Variable Models, including Feature Selection and Feature Extraction.

Subfields and Concepts

  • Supervised Dimensionality Reduction
    • Linear Discriminant Analysis (LDA)
      • Fisher's Linear Discriminant (FLD)
    • Quadratic Discriminant Analysis (QDA)
    • Mixture Discriminant Analysis (MDA)
    • Neural Network Matrix Factorization (NNMF)
    • Feature Selection
  • Unsupervised Dimensionality Reduction
    • Singular Value Decomposition (SVD)
    • Principal Component Analysis (PCA) / Proper Orthogonal Decomposition (POD)
    • Probabilistic PCA (PPCA)
    • Canonical Correlation Analysis (CCA)
    • Independent Component Analysis (ICA)
    • Projection Pursuit
    • Exploratory Factor Analysis (EFA)
    • Singular Spectrum Analysis (SSA)
    • Empirical Orthogonal Function (EOF) Analysis
    • Nonnegative Matrix Factorization (NMF or NNMF)
    • Principal Tensor Analysis / Non-negative Tensor Factorization
    • Multinomial PCA
    • Truncated SVD / Latent Semantic Analysis / Latent Semantic Indexing
    • Maximum-Margin (Minimum-Norm) Matrix Factorization
    • Common Spatial Pattern (especially used in EEG signals)
    • Artificial Neural Networks
      • Autoencoder
        • Linear Autoencoder (equivalent to PCA)
        • Stacked Denoising Autoencoder
        • Generalized Denoising Autoencoder
        • Sparse Autoencoder
        • Contractive Autoencoder (CAE)
        • Variational Autoencoder (VAE)
      • Kohonen Network / Self-Organizing Map (SOM) / Self-Organizing Feature Map (SOFM)
    • Unsupervised Deep Learning
      • Deep Autoencoder
    • K-SVD (used in Dictionary Learning)
  • Nonlinear Dimensionality Reduction
    • Manifold Learning (unsupervised, but supervised variants exist)
      • Autoencoder
      • SOM / SOFM
      • Gaussian Process Latent Variable Model (GPLVM)
      • Diffeomorphic Dimensionality Reduction / Diffeomap
      • Isomap
      • Locally Linear Embedding (LLE)
      • Hessian Eigenmapping or Hessian LLE (HLLE)
      • Modified Locally-Linear Embedding (MLLE)
      • Supervised LLE (SLLE)
      • Topologically Constrained Isometric Embedding (TCIE)
      • Laplacian Eigenmaps / Spectral Embedding
      • Stochastic Proximity Embedding (SPE)
      • Local Tangent Space Alignment (LTSA)
      • t-Distributed Stochastic Neighbor Embedding (t-SNE)
      • Local Multidimensional Scaling (MDS)
      • Kernel PCA (KPCA)
      • Nonlinear PCA (NPCA)
      • Nonlinear ICA
      • Curvilinear Component Analysis
      • Curvilinear Distance Analysis
      • Manifold Alignment
      • Diffusion Maps
      • Maximum Variance Unfolding
  • Latent Variable Models
  • Canonical Angles / Principal Angles (between subspaces)
  • Subspace Tracking
    • Grassmannian Rank-One Update Subspace Estimation (GROUSE)
    • Parallel Estimation and Tracking by REcursive Least Squares (PETRELS)
    • Multiscale Online Union of Subspaces Estimation (MOUSSE)
    • Grassmannian Robust Adaptive Subspace Tracking Algorithm (GRASTA)
    • Online Supervised Dimensionality Reduction (OSDR)
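Several of the linear methods above are closely related: in particular, PCA can be computed directly from the SVD of the centred data matrix, since the right singular vectors are the principal directions and the squared singular values are proportional to the explained variances. A minimal NumPy sketch of this relationship (function and variable names here are illustrative, not taken from any of the references below):

```python
import numpy as np

def pca_via_svd(X, n_components):
    """Project X onto its top principal components using the SVD."""
    X_centered = X - X.mean(axis=0)            # PCA requires centred data
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    components = Vt[:n_components]             # principal directions (rows)
    scores = X_centered @ components.T         # low-dimensional projection
    explained_variance = S[:n_components] ** 2 / (len(X) - 1)
    return scores, components, explained_variance

rng = np.random.default_rng(0)
# 200 samples lying near a 2-D plane embedded in 5-D space
latent = rng.normal(size=(200, 2))
mixing = rng.normal(size=(2, 5))
X = latent @ mixing + 0.01 * rng.normal(size=(200, 5))

scores, components, var = pca_via_svd(X, n_components=2)
print(scores.shape)  # (200, 2)
```

Because the singular values are returned in descending order, the explained variances come out sorted as well; the same decomposition underlies Truncated SVD / Latent Semantic Analysis, which simply skips the centring step.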

Online Courses

Video Lectures

Lecture Notes

Books and Book Chapters

  • Bengio, Y., Goodfellow, I. J., & Courville, A. (2016). "Chapter 13: Linear Factor Models". Deep Learning. MIT Press.
  • Theodoridis, S. (2015). "Chapter 19: Dimensionality Reduction". Machine Learning: A Bayesian and Optimization Perspective. Academic Press.
  • Hastie, T., Tibshirani, R., & Wainwright, M. (2015). "Chapter 7: Matrix Decompositions, Approximations, and Completion". Statistical learning with sparsity: the lasso and generalizations. CRC Press.
  • Shalev-Shwartz, S., & Ben-David, S. (2014). "Chapter 26: Dimensionality Reduction". Understanding Machine Learning: From Theory to Algorithms. Cambridge University Press.
  • Sun, L., Ji, S., & Ye, J. (2013). Multi-Label Dimensionality Reduction. CRC Press.
  • Lu, H., Plataniotis, K. N., & Venetsanopoulos, A. (2013). Multilinear Subspace Learning: Dimensionality Reduction of Multidimensional Data. CRC Press.
  • Rajaraman, A., & Ullman, J. D. (2012). "Chapter 11: Dimensionality Reduction". Mining of Massive Datasets. Cambridge University Press.
  • Murphy, K. P. (2012). "Chapter 12: Latent linear models". Machine Learning: A Probabilistic Perspective. MIT Press.
  • Barber, D. (2012). "Chapter 15: Unsupervised Linear Dimension Reduction". Bayesian Reasoning and Machine Learning. Cambridge University Press.
  • Barber, D. (2012). "Chapter 16: Supervised Linear Dimension Reduction". Bayesian Reasoning and Machine Learning. Cambridge University Press.
  • Barber, D. (2012). "Chapter 21: Latent Linear Models". Bayesian Reasoning and Machine Learning. Cambridge University Press.
  • Alpaydin, E. (2010). "Chapter 6: Dimensionality Reduction". Introduction to machine learning. MIT Press.
  • Comon, P., & Jutten, C. (Eds.). (2010). Handbook of Blind Source Separation: Independent Component Analysis and Applications. Academic Press.
  • Gorban, A. N., Kégl, B., Wunsch, D. C., & Zinovyev, A. (2008). Principal Manifolds for Data Visualization and Dimension Reduction. Springer.
  • Ranjan, A. (2008). A New Approach for Blind Source Separation of Convolutive Sources. VDM Verlag.
  • Lee, J. A., & Verleysen, M. (2007). Nonlinear Dimensionality Reduction. Springer.
  • Skillicorn, D. (2007). Understanding Complex Datasets: Data Mining with Matrix Decompositions. CRC Press.
  • Bishop, C. M. (2006). "Chapter 12: Continuous Latent Variables". Pattern Recognition and Machine Learning. Springer.
  • MacKay, D. J. (2003). "Chapter 34: Independent Component Analysis and Latent Variable Modelling". Information Theory, Inference and Learning Algorithms. Cambridge University Press.

Scholarly Articles

  • Sorzano, C. O. S., Vargas, J., & Montano, A. P. (2014). A survey of dimensionality reduction techniques. arXiv preprint arXiv:1403.2877.
  • Baur, U., Benner, P., & Feng, L. (2014). Model order reduction for linear and nonlinear systems: a system-theoretic perspective. Archives of Computational Methods in Engineering, 21(4), 331-358.
  • Gu, C. (2011). Model Order Reduction of Nonlinear Dynamical Systems. PhD Dissertation, University of California, Berkeley.
  • Burges, C. J. (2010). Dimension Reduction: A Guided Tour. Foundations and Trends® in Machine Learning, 4(3). Now Publishers Inc.
  • Van Der Maaten, L., Postma, E., & Van den Herik, J. (2009). Dimensionality Reduction: A Comparative Review. Technical Report.
  • Cunningham, P. (2008). Dimension Reduction. In Machine Learning Techniques for Multimedia (pp. 91-112). Springer.
  • Fodor, I. K. (2002). A survey of Dimension Reduction Techniques.
  • Jain, A. K., Duin, R. P. W., & Mao, J. (2000). Statistical pattern recognition: A review. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(1), 4-37.


See also

Other Resources