Gaussian Mixture Model and Expectation Maximization

Implementation and comprehensive explanation of the Gaussian Mixture Model, a generative probabilistic model, fitted with the Expectation Maximization optimization method. The tutorial is presented as a Jupyter notebook. It can also be found on Kaggle.

Expectation Maximization is a general method for parameter estimation that maximizes the likelihood of the observed data in models with latent variables (i.e., data that is not fully observed). It has many applications beyond GMMs (e.g., dynamic model parameter estimation), but fitting a GMM is a good way to build intuition for the optimization algorithm.
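To make the idea concrete, here is a minimal sketch (not taken from the notebook) of EM for a 1-D Gaussian mixture: the E-step computes responsibilities, the E-step's soft assignments then drive the M-step re-estimation of weights, means, and variances. The function name `em_gmm_1d` and the quantile-based initialization are illustrative choices, not part of the tutorial.

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=100):
    """Fit a 1-D Gaussian mixture with k components via EM (illustrative sketch)."""
    n = x.size
    pi = np.full(k, 1.0 / k)                       # mixing weights
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)  # spread initial means across data
    var = np.full(k, x.var())                      # start with the overall variance
    for _ in range(iters):
        # E-step: responsibilities r[i, j] ∝ pi_j * N(x_i | mu_j, var_j)
        log_p = (-0.5 * (x[:, None] - mu) ** 2 / var
                 - 0.5 * np.log(2 * np.pi * var) + np.log(pi))
        log_p -= log_p.max(axis=1, keepdims=True)  # numerical stability
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the soft assignments
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Sanity check on two well-separated clusters
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-4, 1, 300), rng.normal(4, 1, 700)])
pi, mu, var = em_gmm_1d(x)
```

On this synthetic data the recovered means land near -4 and 4, and the mixing weights near 0.3 and 0.7; the notebook covers the multivariate case and the derivation in full.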

Reference

The main reference for this tutorial is Pattern Recognition and Machine Learning by Christopher M. Bishop. The book is available online.

