
Efficiency Evaluation of Character-level RNN Training Schedules

Cedric De Boom (UGent), Sam Leroux (UGent), Steven Bohez (UGent), Pieter Simoens (UGent), Thomas Demeester (UGent) and Bart Dhoedt (UGent)
Abstract
We present four training and prediction schedules for the same character-level recurrent neural network. The efficiency of these schedules is tested in terms of model effectiveness as a function of training time and amount of training data seen. We conclude that sequence-to-sequence training, together with sequence-to-sample prediction, performs most efficiently and consistently across multiple parameter settings. We show that the choice of training and prediction schedule can have a considerable impact on prediction effectiveness for a given training budget.
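As a rough illustration of the schedules named in the abstract, the sketch below shows how a character stream could be cut into (input, target) pairs under a sequence-to-sequence versus a sequence-to-sample schedule. The function names, the window length and the use of plain Python strings are illustrative assumptions and are not taken from the paper.

def seq_to_seq_pairs(text, window=8):
    # Sequence-to-sequence schedule (assumed form): the target is the input
    # shifted by one character, so the network receives a training signal
    # at every time step of the window.
    for start in range(len(text) - window):
        chunk = text[start:start + window + 1]
        yield chunk[:-1], chunk[1:]

def seq_to_sample_pairs(text, window=8):
    # Sequence-to-sample schedule (assumed form): the network reads the whole
    # window and is asked for only the single character that follows it.
    for start in range(len(text) - window):
        yield text[start:start + window], text[start + window]

if __name__ == "__main__":
    corpus = "the quick brown fox"
    print(next(seq_to_seq_pairs(corpus)))     # ('the quic', 'he quick')
    print(next(seq_to_sample_pairs(corpus)))  # ('the quic', 'k')

Under the sequence-to-sequence schedule every window contributes one target per character rather than a single target, which is one plausible reason it can extract more supervision from the same amount of training data.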
Keywords
IBCN, neural networks, deep learning, efficiency, RNN, machine learning, text, characters

Downloads

  • paper.pdf (full text, open access, PDF, 316.60 KB)

Citation

Chicago
De Boom, Cedric, Sam Leroux, Steven Bohez, Pieter Simoens, Thomas Demeester, and Bart Dhoedt. 2016. “Efficiency Evaluation of Character-level RNN Training Schedules.” In Data Efficient Machine Learning Workshop, ICML.
APA
De Boom, C., Leroux, S., Bohez, S., Simoens, P., Demeester, T., & Dhoedt, B. (2016). Efficiency Evaluation of Character-level RNN Training Schedules. Data Efficient Machine Learning workshop, ICML. Presented at the Data Efficient Machine Learning workshop (ICML 2016).
Vancouver
1. De Boom C, Leroux S, Bohez S, Simoens P, Demeester T, Dhoedt B. Efficiency Evaluation of Character-level RNN Training Schedules. Data Efficient Machine Learning workshop, ICML. 2016.
MLA
De Boom, Cedric, Sam Leroux, Steven Bohez, et al. “Efficiency Evaluation of Character-level RNN Training Schedules.” Data Efficient Machine Learning Workshop, ICML. 2016. Print.
BibTeX
@inproceedings{8023862,
  abstract     = {We present four training and prediction schedules for the same character-level recurrent neural network. The efficiency of these schedules is tested in terms of model effectiveness as a function of training time and amount of training data seen. We conclude that sequence-to-sequence training, together with sequence-to-sample prediction, performs most efficiently and consistently across multiple parameter settings. We show that the choice of training and prediction schedule can have a considerable impact on prediction effectiveness for a given training budget.},
  author       = {De Boom, Cedric and Leroux, Sam and Bohez, Steven and Simoens, Pieter and Demeester, Thomas and Dhoedt, Bart},
  booktitle    = {Data Efficient Machine Learning workshop, ICML},
  keyword      = {IBCN,neural networks,deep learning,efficiency,RNN,machine learning,text,characters},
  language     = {eng},
  location     = {New York, USA},
  pages        = {2},
  title        = {Efficiency Evaluation of Character-level RNN Training Schedules},
  year         = {2016},
}