
Energy-based temporal neural networks for imputing missing values

Philémon Brakel (UGent) and Benjamin Schrauwen (UGent)
Abstract
Imputing missing values in high dimensional time series is a difficult problem. There have been some approaches to the problem [11,8] where neural architectures were trained as probabilistic models of the data. However, we argue that this approach is not optimal. We propose to view temporal neural networks with latent variables as energy-based models and train them for missing value recovery directly. In this paper we introduce two energy-based models. The first model is based on a one dimensional convolution and the second model utilizes a recurrent neural network. We demonstrate how ideas from the energy-based learning framework can be used to train these models to recover missing values. The models are evaluated on a motion capture dataset.
Keywords
neural networks, energy-based models, time series, missing values, machine learning, optimization
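As a rough illustration of the idea summarised in the abstract, the sketch below treats a small one-dimensional convolutional network as an energy-based model and fills in the missing entries of a sequence by gradient descent on the energy with respect to those entries only, while observed entries stay clamped. This is not the authors' implementation and covers only the imputation step, not training: the energy function, the parameter names (W, b, c), the shapes, and the finite-difference gradients are assumptions made purely for illustration.

import numpy as np

rng = np.random.default_rng(0)

T, D, H, K = 30, 4, 8, 3              # time steps, input dims, hidden units, filter width
W = rng.normal(0.0, 0.1, (H, D, K))   # convolutional filters (hypothetical parameters)
b = np.zeros(H)                       # hidden biases
c = np.zeros(D)                       # visible biases

def energy(x):
    """Free-energy-style score of a sequence x of shape (T, D); lower is more plausible."""
    windows = np.stack([x[t:t + K] for t in range(T - K + 1)])   # all length-K windows
    pre = np.einsum('tkd,hdk->th', windows, W) + b               # hidden pre-activations
    hidden_term = np.sum(np.logaddexp(0.0, pre))                 # softplus over hidden units
    visible_term = np.sum(x @ c)
    return -hidden_term - visible_term

def impute(x_obs, mask, steps=50, lr=0.1, eps=1e-4):
    """Fill in entries where mask is False by descending the energy w.r.t. those entries."""
    x = x_obs.copy()
    x[~mask] = 0.0                                # naive initialisation of missing values
    missing = list(zip(*np.where(~mask)))
    for _ in range(steps):
        grad = np.zeros_like(x)
        for (t, d) in missing:                    # finite differences, for clarity only
            x[t, d] += eps
            e_plus = energy(x)
            x[t, d] -= 2 * eps
            e_minus = energy(x)
            x[t, d] += eps
            grad[t, d] = (e_plus - e_minus) / (2 * eps)
        x[~mask] -= lr * grad[~mask]              # observed entries remain clamped
    return x

# Toy usage: hide roughly 20% of a random sequence and recover the hidden entries.
x_true = rng.normal(size=(T, D))
mask = rng.random((T, D)) > 0.2                   # True = observed
x_hat = impute(np.where(mask, x_true, 0.0), mask)
print('mean abs error on missing entries:', np.abs(x_hat - x_true)[~mask].mean())

In practice an automatic-differentiation framework would replace the finite-difference loop, and the energy would come from the trained convolutional or recurrent model rather than from random parameters.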

Downloads

  • pbiconip.pdf (full text, open access, PDF, 288.43 KB)

Citation

Please use this URL to cite or link to this publication:

Chicago
Brakel, Philémon, and Benjamin Schrauwen. 2012. “Energy-based Temporal Neural Networks for Imputing Missing Values.” In Lecture Notes in Computer Science, ed. T Huang, 7664:575–582. Berlin, Germany: Springer.
APA
Brakel, P., & Schrauwen, B. (2012). Energy-based temporal neural networks for imputing missing values. In T. Huang (Ed.), LECTURE NOTES IN COMPUTER SCIENCE (Vol. 7664, pp. 575–582). Presented at the 19th International Conference on Neural Information Processing (ICONIP 2012), Doha, Qatar. Berlin, Germany: Springer.
Vancouver
1. Brakel P, Schrauwen B. Energy-based temporal neural networks for imputing missing values. In: Huang T, editor. LECTURE NOTES IN COMPUTER SCIENCE. Berlin, Germany: Springer; 2012. p. 575–82.
MLA
Brakel, Philémon, and Benjamin Schrauwen. “Energy-based Temporal Neural Networks for Imputing Missing Values.” Lecture Notes in Computer Science. Ed. T Huang. Vol. 7664. Berlin, Germany: Springer, 2012. 575–582. Print.
@inproceedings{3055728,
  abstract     = {Imputing missing values in high dimensional time series is a difficult problem. There have been some approaches to the problem [11,8] where neural architectures were trained as probabilistic models of the data. However, we argue that this approach is not optimal. We propose to view temporal neural networks with latent variables as energy-based models and train them for missing value recovery directly. In this paper we introduce two energy-based models. The first model is based on a one dimensional convolution and the second model utilizes a recurrent neural network. We demonstrate how ideas from the energy-based learning framework can be used to train these models to recover missing values. The models are evaluated on a motion capture dataset.},
  author       = {Brakel, Phil{\'e}mon and Schrauwen, Benjamin},
  booktitle    = {LECTURE NOTES IN COMPUTER SCIENCE},
  editor       = {Huang, T},
  isbn         = {9783642344800},
  issn         = {0302-9743},
  keyword      = {neural networks,energy-based models,time series,missing values,machine learning,optimization},
  language     = {eng},
  location     = {Doha, Qatar},
  pages        = {575--582},
  publisher    = {Springer},
  title        = {Energy-based temporal neural networks for imputing missing values},
  volume       = {7664},
  year         = {2012},
}
