
Memory in reservoirs for high dimensional input

Michiel Hermans (UGent) and Benjamin Schrauwen (UGent)
Abstract
Reservoir Computing (RC) is a recently introduced scheme to employ recurrent neural networks while circumventing the difficulties that typically appear when training the recurrent weights. The ‘reservoir’ is a fixed randomly initiated recurrent network which receives input via a random mapping. Only an instantaneous linear mapping from the network to the output is trained which can be done with linear regression. In this paper we study dynamical properties of reservoirs receiving a high number of inputs. More specifically, we investigate how the internal state of the network retains fading memory of its input signal. Memory properties for random recurrent networks have been thoroughly examined in past research, but only for one-dimensional input. Here we take into account statistics which will typically occur in high dimensional signals. We find useful empirical data which expresses how memory in recurrent networks is distributed over the individual principal components of the input.
Keywords
NEURAL-NETWORKS

Downloads

  • IJCNN2010.pdf — full text, open access, PDF, 351.60 KB

Citation

Chicago
Hermans, Michiel, and Benjamin Schrauwen. 2010. “Memory in Reservoirs for High Dimensional Input.” In IEEE International Joint Conference on Neural Networks (IJCNN). New York, NY, USA: IEEE.
APA
Hermans, M., & Schrauwen, B. (2010). Memory in reservoirs for high dimensional input. IEEE International Joint Conference on Neural Networks (IJCNN). Presented at the 2010 IEEE International Joint Conference on Neural Networks (IJCNN 2010); World Congress on Computational Intelligence (WCCI 2010). New York, NY, USA: IEEE.
Vancouver
1. Hermans M, Schrauwen B. Memory in reservoirs for high dimensional input. IEEE International Joint Conference on Neural Networks (IJCNN). New York, NY, USA: IEEE; 2010.
MLA
Hermans, Michiel, and Benjamin Schrauwen. “Memory in Reservoirs for High Dimensional Input.” IEEE International Joint Conference on Neural Networks (IJCNN). New York, NY, USA: IEEE, 2010. Print.
@inproceedings{1066010,
  abstract     = {Reservoir Computing (RC) is a recently introduced scheme to employ recurrent neural networks while circumventing the difficulties that typically appear when training the recurrent weights. The ‘reservoir’ is a fixed randomly initiated recurrent network which receives input via a random mapping. Only an instantaneous linear mapping from the network to the output is trained which can be done with linear regression. In this paper we study dynamical properties of reservoirs receiving a high number of inputs. More specifically, we investigate how the internal state of the network retains fading memory of its input signal. Memory properties for random recurrent networks have been thoroughly examined in past research, but only for one-dimensional input. Here we take into account statistics which will typically occur in high dimensional signals. We find useful empirical data which expresses how memory in recurrent networks is distributed over the individual principal components of the input.},
  author       = {Hermans, Michiel and Schrauwen, Benjamin},
  booktitle    = {IEEE International Joint Conference on Neural Networks (IJCNN)},
  isbn         = {9781424469178},
  issn         = {1098-7576},
  keywords     = {NEURAL-NETWORKS},
  language     = {eng},
  location     = {Barcelona, Spain},
  pages        = {7},
  publisher    = {IEEE},
  title        = {Memory in reservoirs for high dimensional input},
  url          = {http://dx.doi.org/10.1109/IJCNN.2010.5596884},
  year         = {2010},
}
