
Oger: modular learning architectures for large-scale sequential processing

Abstract
Oger (OrGanic Environment for Reservoir computing) is a Python toolbox for building, training and evaluating modular learning architectures on large data sets. It builds on MDP for its modularity, and adds processing of sequential data sets, gradient descent training, several cross-validation schemes and parallel parameter optimization methods. Additionally, several learning algorithms are implemented, such as different reservoir implementations (both sigmoid and spiking), ridge regression, conditional restricted Boltzmann machine (CRBM) and others, including GPU accelerated versions. Oger is released under the GNU LGPL, and is available from http://organic.elis.ugent.be/oger.
Keywords
modular architectures, sequential processing, Python
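The abstract above describes combining a sigmoid reservoir with a ridge-regression readout. As a rough illustration of that pipeline — not Oger's actual API — here is a minimal numpy sketch of an echo state network trained on a hypothetical delayed-recall task (all names and parameters below are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy task: reproduce the input signal delayed by 5 steps.
T, delay = 2000, 5
u = rng.uniform(-1, 1, size=T)
target = np.roll(u, delay)
target[:delay] = 0.0

# Random sigmoid (tanh) reservoir in the echo-state-network style.
n_res = 100
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
# Rescale recurrent weights to spectral radius 0.9 (echo state property).
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

# Drive the reservoir and collect its states.
states = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

# Ridge-regression readout, closed form: w = (S'S + aI)^-1 S'y.
washout = 100  # discard initial transient states
S, y = states[washout:], target[washout:]
alpha = 1e-6
w_out = np.linalg.solve(S.T @ S + alpha * np.eye(n_res), S.T @ y)

# Normalized root-mean-square error of the trained readout.
pred = S @ w_out
nrmse = np.sqrt(np.mean((pred - y) ** 2)) / np.std(y)
print(round(nrmse, 3))
```

The reservoir itself stays fixed; only the linear readout is trained, which is why a closed-form ridge solve suffices — this is the core idea behind the reservoir implementations the abstract mentions.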

Citation

MLA
Verstraeten, David, Benjamin Schrauwen, Sander Dieleman, et al. “Oger: Modular Learning Architectures for Large-scale Sequential Processing.” JOURNAL OF MACHINE LEARNING RESEARCH 13 (2012): 2995–2998. Print.
APA
Verstraeten, D., Schrauwen, B., Dieleman, S., Brakel, P., Buteneers, P., & Pecevski, D. (2012). Oger: modular learning architectures for large-scale sequential processing. JOURNAL OF MACHINE LEARNING RESEARCH, 13, 2995–2998.
Chicago author-date
Verstraeten, David, Benjamin Schrauwen, Sander Dieleman, Philémon Brakel, Pieter Buteneers, and Dejan Pecevski. 2012. “Oger: Modular Learning Architectures for Large-scale Sequential Processing.” Journal of Machine Learning Research 13: 2995–2998.
Vancouver
1. Verstraeten D, Schrauwen B, Dieleman S, Brakel P, Buteneers P, Pecevski D. Oger: modular learning architectures for large-scale sequential processing. JOURNAL OF MACHINE LEARNING RESEARCH. 2012;13:2995–8.
IEEE
[1] D. Verstraeten, B. Schrauwen, S. Dieleman, P. Brakel, P. Buteneers, and D. Pecevski, “Oger: modular learning architectures for large-scale sequential processing,” JOURNAL OF MACHINE LEARNING RESEARCH, vol. 13, pp. 2995–2998, 2012.
BibTeX
@article{3054005,
  abstract     = {Oger (OrGanic Environment for Reservoir computing) is a Python toolbox for building, training and evaluating modular learning architectures on large data sets. It builds on MDP for its modularity, and adds processing of sequential data sets, gradient descent training, several cross-validation schemes and parallel parameter optimization methods. Additionally, several learning algorithms are implemented, such as different reservoir implementations (both sigmoid and spiking), ridge regression, conditional restricted Boltzmann machine (CRBM) and others, including GPU accelerated versions. Oger is released under the GNU LGPL, and is available from http://organic.elis.ugent.be/oger.},
  author       = {Verstraeten, David and Schrauwen, Benjamin and Dieleman, Sander and Brakel, Philémon and Buteneers, Pieter and Pecevski, Dejan},
  issn         = {1532-4435},
  journal      = {JOURNAL OF MACHINE LEARNING RESEARCH},
  keywords     = {modular architectures,sequential processing,Python},
  language     = {eng},
  pages        = {2995--2998},
  title        = {Oger: modular learning architectures for large-scale sequential processing},
  url          = {http://jmlr.csail.mit.edu/papers/volume13/verstraeten12a/verstraeten12a.pdf},
  volume       = {13},
  year         = {2012},
}
