Mixture Model

From Ioannis Kourouklides

This page contains resources about Mixture Models and the Expectation-Maximization (EM) Algorithm.

Subfields and Concepts

  • Bayesian Models
  • Non-Bayesian Models
  • Mixture of Experts
  • Hierarchical Mixture of Experts
  • k-Nearest Neighbour (k-NN)
  • Gaussian Mixture Model (GMM)
  • Deep Latent Gaussian Mixture Models (DLGMMs)
  • Mixture of Factor Analyzers
  • Mixture of Dimensionality Reducers
  • Latent Variable Models
  • Maximum a posteriori (MAP) EM Algorithm
  • Sparse EM Algorithm
  • Baum-Welch Algorithm (i.e. the EM Algorithm applied to Hidden Markov Models)
  • Generalized (Incomplete) EM Algorithm
  • Monte Carlo EM
  • Variational EM
  • Variational Bayesian EM (VBEM)
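
The EM algorithm and the Gaussian Mixture Model (GMM) listed above can be illustrated with a minimal sketch, assuming NumPy; the function name `em_gmm_1d` and all parameter choices here are hypothetical, not taken from any of the listed references.

```python
import numpy as np

def em_gmm_1d(x, n_components=2, n_iter=50, seed=0):
    """Fit a 1-D Gaussian Mixture Model with the EM algorithm (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # Initialisation: pick random data points as means, shared variance, uniform weights
    means = rng.choice(x, n_components, replace=False)
    variances = np.full(n_components, np.var(x))
    weights = np.full(n_components, 1.0 / n_components)
    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] proportional to w_k * N(x_i | mu_k, var_k),
        # computed in log space for numerical stability
        diff = x[:, None] - means[None, :]
        log_pdf = -0.5 * (np.log(2 * np.pi * variances) + diff**2 / variances)
        log_r = np.log(weights) + log_pdf
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances from the responsibilities
        nk = r.sum(axis=0)
        weights = nk / n
        means = (r * x[:, None]).sum(axis=0) / nk
        variances = (r * (x[:, None] - means) ** 2).sum(axis=0) / nk
    return weights, means, variances

# Usage: two well-separated clusters; EM should recover one negative and one positive mean
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3.0, 1.0, 500), rng.normal(3.0, 1.0, 500)])
w, m, v = em_gmm_1d(x)
```

For a Bayesian treatment (MAP EM, Variational EM, VBEM), the M-step above would be replaced by updates that incorporate priors over the parameters; see the book chapters listed below.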

Online Courses

Video Lectures

Lecture Notes

Books and Book Chapters

  • Theodoridis, S. (2015). "Chapter 12.5: Latent Variables and the EM Algorithm". Machine Learning: A Bayesian and Optimization Perspective. Academic Press.
  • Murphy, K. P. (2012). "Chapter 11: Mixture models and the EM algorithm". Machine Learning: A Probabilistic Perspective. MIT Press.
  • Barber, D. (2012). "Chapter 11: Learning with Hidden Variables". Bayesian Reasoning and Machine Learning. Cambridge University Press.
  • Barber, D. (2012). "Chapter 20: Mixture Models". Bayesian Reasoning and Machine Learning. Cambridge University Press.
  • McLachlan, G., & Krishnan, T. (2007). The EM Algorithm and Extensions. John Wiley & Sons.
  • Bishop, C. M. (2006). "Chapter 9: Mixture Models and EM". Pattern Recognition and Machine Learning. Springer.
  • McLachlan, G., & Peel, D. (2004). Finite Mixture Models. John Wiley & Sons.

Scholarly Articles

  • Nalisnick, E., Hertel, L., & Smyth, P. (2016). Approximate inference for deep latent Gaussian mixtures. In NIPS Workshop on Bayesian Deep Learning.

See also

Other Resources