Ghent University Academic Bibliography


One step backpropagation through time for learning input mapping in reservoir computing applied to speech recognition

Michiel Hermans and Benjamin Schrauwen (2010) IEEE International Symposium on Circuits and Systems. p.521-524
abstract
Recurrent neural networks are very powerful engines for processing information that is coded in time; however, many problems with common training algorithms, such as Backpropagation Through Time, remain. Because of this, another important learning setup known as Reservoir Computing has appeared in recent years, where one uses an essentially untrained network to perform computations. Though very successful in many applications, using a random network can be quite inefficient when considering the required number of neurons and the associated computational costs. In this paper we introduce a highly simplified version of Backpropagation Through Time by basically truncating the error backpropagation to one step back in time, and we combine this with the classic Reservoir Computing setup using an instantaneous linear readout. We apply this setup to a spoken digit recognition task and show it to give very good results for small networks.
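For intuition, the training loop described in the abstract can be sketched in NumPy as follows. This is a minimal illustration under assumed toy dimensions, with a simple gradient-descent readout; the paper's actual setup (its readout training and spoken-digit features) is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; not taken from the paper)
n_in, n_res, n_out, T = 3, 50, 2, 200

# Random reservoir, rescaled to spectral radius 0.9 (echo-state heuristic)
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.normal(0.0, 0.1, (n_res, n_in))   # input mapping, to be trained
W_out = np.zeros((n_out, n_res))             # instantaneous linear readout

U = rng.normal(size=(T, n_in))               # toy input sequence
Y = rng.normal(size=(T, n_out))              # toy target sequence

lr = 0.01
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W_in @ U[t] + W @ x)         # reservoir state update
    e = W_out @ x - Y[t]                     # output error at time t
    # One-step truncated BPTT: the error is propagated through the
    # current state transition only; the dependence of x on earlier
    # inputs is ignored.
    dpre = (W_out.T @ e) * (1.0 - x ** 2)    # back through the tanh
    W_in -= lr * np.outer(dpre, U[t])        # update the input mapping
    W_out -= lr * np.outer(e, x)             # simple LMS readout update
```

Truncating the backpropagation to a single step keeps the update cost per time step constant, which is what makes the method cheap compared to full BPTT.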
Please use this URL to cite or link to this publication: http://hdl.handle.net/1854/LU-1065982
author
Michiel Hermans and Benjamin Schrauwen
year
2010
type
conference (proceedingsPaper)
publication status
published
keyword
speech recognition, backpropagation, learning (artificial intelligence), recurrent neural nets, NEURAL-NETWORKS, SYSTEMS
in
IEEE International Symposium on Circuits and Systems
issue title
2010 IEEE International Symposium on Circuits and Systems
pages
521 - 524
publisher
IEEE
place of publication
New York, NY, USA
conference name
2010 IEEE International Symposium on Circuits and Systems (ISCAS 2010): Nano-bio circuit fabrics and systems
conference location
Paris, France
conference start
2010-05-30
conference end
2010-06-02
Web of Science type
Proceedings Paper
Web of Science id
000287216000130
ISSN
0271-4302
ISBN
9781424453092
9781424453085
DOI
10.1109/ISCAS.2010.5537568
language
English
UGent publication?
yes
classification
P1
copyright statement
I have transferred the copyright for this publication to the publisher
id
1065982
handle
http://hdl.handle.net/1854/LU-1065982
date created
2010-10-27 11:12:39
date last changed
2017-01-02 09:52:21
@inproceedings{1065982,
  abstract     = {Recurrent neural networks are very powerful engines for processing information that is coded in time; however, many problems with common training algorithms, such as Backpropagation Through Time, remain. Because of this, another important learning setup known as Reservoir Computing has appeared in recent years, where one uses an essentially untrained network to perform computations. Though very successful in many applications, using a random network can be quite inefficient when considering the required number of neurons and the associated computational costs. In this paper we introduce a highly simplified version of Backpropagation Through Time by basically truncating the error backpropagation to one step back in time, and we combine this with the classic Reservoir Computing setup using an instantaneous linear readout. We apply this setup to a spoken digit recognition task and show it to give very good results for small networks.},
  author       = {Hermans, Michiel and Schrauwen, Benjamin},
  booktitle    = {IEEE International Symposium on Circuits and Systems},
  isbn         = {9781424453092},
  issn         = {0271-4302},
  keywords     = {speech recognition, backpropagation, learning (artificial intelligence), recurrent neural nets, NEURAL-NETWORKS, SYSTEMS},
  language     = {eng},
  location     = {Paris, France},
  pages        = {521--524},
  publisher    = {IEEE},
  title        = {One step backpropagation through time for learning input mapping in reservoir computing applied to speech recognition},
  url          = {http://dx.doi.org/10.1109/ISCAS.2010.5537568},
  year         = {2010},
}
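The BibTeX entry above can be used from a LaTeX document along these lines (a minimal sketch; the bibliography file name references.bib is an assumption):

```latex
\documentclass{article}
\begin{document}
Hermans and Schrauwen~\cite{1065982} truncate the error
backpropagation in BPTT to a single step back in time.
\bibliographystyle{plain}
\bibliography{references}  % assumes the entry is saved in references.bib
\end{document}
```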

Chicago
Hermans, Michiel, and Benjamin Schrauwen. 2010. “One Step Backpropagation Through Time for Learning Input Mapping in Reservoir Computing Applied to Speech Recognition.” In IEEE International Symposium on Circuits and Systems, 521–524. New York, NY, USA: IEEE.
APA
Hermans, M., & Schrauwen, B. (2010). One step backpropagation through time for learning input mapping in reservoir computing applied to speech recognition. IEEE International Symposium on Circuits and Systems (pp. 521–524). Presented at the 2010 IEEE International Symposium on Circuits and Systems (ISCAS 2010): Nano-bio circuit fabrics and systems, New York, NY, USA: IEEE.
Vancouver
1.
Hermans M, Schrauwen B. One step backpropagation through time for learning input mapping in reservoir computing applied to speech recognition. IEEE International Symposium on Circuits and Systems. New York, NY, USA: IEEE; 2010. p. 521–4.
MLA
Hermans, Michiel, and Benjamin Schrauwen. “One Step Backpropagation Through Time for Learning Input Mapping in Reservoir Computing Applied to Speech Recognition.” IEEE International Symposium on Circuits and Systems. New York, NY, USA: IEEE, 2010. 521–524. Print.