
Factoring variations in natural images with deep Gaussian mixture models

Abstract
Generative models can be seen as the Swiss Army knives of machine learning, as many problems can be written probabilistically in terms of the distribution of the data, including prediction, reconstruction, imputation and simulation. One of the most promising directions for unsupervised learning may lie in Deep Learning methods, given their success in supervised learning. However, one of the current problems with deep unsupervised learning methods is that they are often harder to scale. As a result there are some easier, more scalable shallow methods, such as the Gaussian Mixture Model and the Student-t Mixture Model, that remain surprisingly competitive. In this paper we propose a new scalable deep generative model for images, called the Deep Gaussian Mixture Model, that is a straightforward but powerful generalization of GMMs to multiple layers. The parametrization of a Deep GMM allows it to efficiently capture products of variations in natural images. We propose a new EM-based algorithm that scales well to large datasets, and we show that both the Expectation and the Maximization steps can easily be distributed over multiple machines. In our density estimation experiments we show that deeper GMM architectures generalize better than more shallow ones, with results in the same ballpark as the state of the art.
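To make the factored parametrization concrete, here is a minimal sketch (not the authors' code; `deep_gmm_logpdf` and its argument layout are our own assumptions) of how a Deep GMM's density can be evaluated by enumerating paths through the layers. Each path picks one affine transformation per layer, the composition of those transformations is again affine, and pushing a standard-normal latent through it yields one Gaussian per path, so the Deep GMM collapses to a large shallow GMM whose components share parameters across layers.

```python
import itertools

import numpy as np
from scipy.special import logsumexp
from scipy.stats import multivariate_normal


def deep_gmm_logpdf(x, layers, weights):
    """Log-density of x under a deep GMM, by brute-force path enumeration.

    layers[l] is a list of (A, b) affine maps for layer l (layer 0 is
    applied first, to a standard-normal latent z); weights[l][j] is the
    mixing weight of component j in layer l. Each path of component
    choices composes to a single affine map x = A z + b, so the model
    is equivalent to one big shallow GMM with a factored parametrization.
    """
    d = layers[0][0][0].shape[1]  # latent dimensionality
    log_terms = []
    for path in itertools.product(*[range(len(layer)) for layer in layers]):
        A, b, log_w = np.eye(d), np.zeros(d), 0.0
        for l, j in enumerate(path):
            A_l, b_l = layers[l][j]
            A, b = A_l @ A, A_l @ b + b_l  # compose the affine maps
            log_w += np.log(weights[l][j])
        # Pushing z ~ N(0, I) through x = A z + b gives x ~ N(b, A A^T).
        log_terms.append(log_w + multivariate_normal.logpdf(x, mean=b, cov=A @ A.T))
    return logsumexp(log_terms)  # mixture over all paths


# Example: a 2-layer deep GMM in 2-D, two components per layer (4 paths).
d = 2
layers = [
    [(np.diag([1.0, 0.5]), np.zeros(d)), (np.diag([0.5, 1.0]), np.ones(d))],
    [(np.eye(d), np.zeros(d)), (np.eye(d), -np.ones(d))],
]
weights = [[0.5, 0.5], [0.3, 0.7]]
print(deep_gmm_logpdf(np.zeros(d), layers, weights))
```

Note that exhaustive enumeration is exponential in depth and only workable at toy sizes; the scalable EM-based training the abstract describes exists precisely so that every path need not be visited for every data point.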

Downloads

  • factoring-variations-in-natural-images-with-deep-gaussian-mixture-models: full text | open access | PDF | 676.19 KB

Citation

Please use this URL to cite or link to this publication:

MLA
van den Oord, Aäron, and Benjamin Schrauwen. “Factoring Variations in Natural Images with Deep Gaussian Mixture Models.” Neural Information Processing Systems, Proceedings, 2014.
APA
van den Oord, A., & Schrauwen, B. (2014). Factoring variations in natural images with deep Gaussian mixture models. Neural Information Processing Systems, Proceedings. Presented at the Neural Information Processing Systems, Montreal.
Chicago author-date
Oord, Aäron van den, and Benjamin Schrauwen. 2014. “Factoring Variations in Natural Images with Deep Gaussian Mixture Models.” In Neural Information Processing Systems, Proceedings.
Chicago author-date (all authors)
van den Oord, Aäron, and Benjamin Schrauwen. 2014. “Factoring Variations in Natural Images with Deep Gaussian Mixture Models.” In Neural Information Processing Systems, Proceedings.
Vancouver
1. van den Oord A, Schrauwen B. Factoring variations in natural images with deep Gaussian mixture models. In: Neural Information Processing Systems, Proceedings. 2014.
IEEE
[1] A. van den Oord and B. Schrauwen, “Factoring variations in natural images with deep Gaussian mixture models,” in Neural Information Processing Systems, Proceedings, Montreal, 2014.
BibTeX
@inproceedings{5824055,
  abstract     = {{Generative models can be seen as the Swiss Army knives of machine learning, as many problems can be written probabilistically in terms of the distribution of the data, including prediction, reconstruction, imputation and simulation. One of the most promising directions for unsupervised learning may lie in Deep Learning methods, given their success in supervised learning. However, one of the current problems with deep unsupervised learning methods is that they are often harder to scale. As a result there are some easier, more scalable shallow methods, such as the Gaussian Mixture Model and the Student-t Mixture Model, that remain surprisingly competitive. In this paper we propose a new scalable deep generative model for images, called the Deep Gaussian Mixture Model, that is a straightforward but powerful generalization of GMMs to multiple layers. The parametrization of a Deep GMM allows it to efficiently capture products of variations in natural images. We propose a new EM-based algorithm that scales well to large datasets, and we show that both the Expectation and the Maximization steps can easily be distributed over multiple machines. In our density estimation experiments we show that deeper GMM architectures generalize better than more shallow ones, with results in the same ballpark as the state of the art.}},
  author       = {{van den Oord, Aäron and Schrauwen, Benjamin}},
  booktitle    = {{Neural Information Processing Systems, Proceedings}},
  language     = {{eng}},
  location     = {{Montreal}},
  pages        = {{9}},
  title        = {{Factoring variations in natural images with deep Gaussian mixture models}},
  year         = {{2014}},
}