
Photonic delay systems as machine learning implementations
- Author
- Michiel Hermans (UGent), Miguel C. Soriano, Joni Dambre (UGent), Peter Bienstman (UGent) and Ingo Fischer
- Abstract
- Nonlinear photonic delay systems present interesting implementation platforms for machine learning models. They can be extremely fast, offer great degrees of parallelism and potentially consume far less power than digital processors. So far they have been successfully employed for signal processing using the Reservoir Computing paradigm. In this paper we show that their range of applicability can be greatly extended if we use gradient descent with backpropagation through time on a model of the system to optimize the input encoding of such systems. We perform physical experiments that demonstrate that the obtained input encodings work well in reality, and we show that optimized systems perform significantly better than the common Reservoir Computing approach. The results presented here demonstrate that common gradient descent techniques from machine learning may well be applicable on physical neuro-inspired analog computers.
- Keywords
- recurrent neural networks, STATES, optical computing, machine learning models
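
The abstract above describes training only the input encoding of a nonlinear delay system by simulating the system and applying gradient descent with backpropagation through time, while the readout remains a linear combination of sampled states. The snippet below is a minimal, hypothetical sketch of that idea in JAX; the sine nonlinearity, feedback strength, number of virtual nodes, toy one-step-recall task, and all names and parameter values are illustrative assumptions, not the authors' actual model or code.

```python
# Hypothetical illustration (not the authors' code): train the input encoding
# (mask) of a simulated nonlinear delay system by gradient descent, using JAX
# autodiff to perform backpropagation through time.
import jax
import jax.numpy as jnp

N_VIRTUAL = 50   # virtual nodes per delay-line roundtrip (assumed)
FEEDBACK = 0.8   # feedback strength of the delay loop (assumed)
STEPS = 200      # number of input samples in the toy data set (assumed)

def run_delay_system(mask, inputs):
    """Discretized delay-system model: each scalar input sample is multiplexed
    over the delay line by `mask`; sin() stands in for the device nonlinearity
    (an assumption, not the authors' model)."""
    def step(state, u):
        new_state = jnp.sin(FEEDBACK * state + mask * u)
        return new_state, new_state       # carry, and state to collect
    _, states = jax.lax.scan(step, jnp.zeros(N_VIRTUAL), inputs)
    return states                         # shape (STEPS, N_VIRTUAL)

def loss_fn(params, inputs, targets):
    states = run_delay_system(params["mask"], inputs)
    preds = states @ params["readout"]    # linear readout of the sampled states
    return jnp.mean((preds - targets) ** 2)

k_in, k_mask, k_out = jax.random.split(jax.random.PRNGKey(0), 3)
inputs = jax.random.normal(k_in, (STEPS,))
targets = jnp.roll(inputs, 1)             # toy task: recall the previous input
params = {
    "mask": 0.1 * jax.random.normal(k_mask, (N_VIRTUAL,)),
    "readout": 0.1 * jax.random.normal(k_out, (N_VIRTUAL,)),
}

# Differentiating through the scan over time steps is backpropagation through
# time; it yields gradients for both the input mask and the readout weights.
loss_and_grad = jax.jit(jax.value_and_grad(loss_fn))
for it in range(500):                     # plain full-batch gradient descent
    loss, grads = loss_and_grad(params, inputs, targets)
    params = jax.tree_util.tree_map(lambda p, g: p - 0.05 * g, params, grads)
```

In the approach described by the abstract, an encoding optimized on such a simulated model is then applied to the physical setup; this sketch stays entirely in simulation.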
Downloads
- (...).pdf | full text | UGent only | 1.08 MB
Citation
Please use this url to cite or link to this publication: http://hdl.handle.net/1854/LU-7196077
- MLA
- Hermans, Michiel, et al. “Photonic Delay Systems as Machine Learning Implementations.” JOURNAL OF MACHINE LEARNING RESEARCH, vol. 16, 2015, pp. 2081–97.
- APA
- Hermans, M., Soriano, M. C., Dambre, J., Bienstman, P., & Fischer, I. (2015). Photonic delay systems as machine learning implementations. JOURNAL OF MACHINE LEARNING RESEARCH, 16, 2081–2097.
- Chicago author-date
- Hermans, Michiel, Miguel C. Soriano, Joni Dambre, Peter Bienstman, and Ingo Fischer. 2015. “Photonic Delay Systems as Machine Learning Implementations.” JOURNAL OF MACHINE LEARNING RESEARCH 16: 2081–97.
- Chicago author-date (all authors)
- Hermans, Michiel, Miguel C. Soriano, Joni Dambre, Peter Bienstman, and Ingo Fischer. 2015. “Photonic Delay Systems as Machine Learning Implementations.” JOURNAL OF MACHINE LEARNING RESEARCH 16: 2081–2097.
- Vancouver
- 1. Hermans M, Soriano MC, Dambre J, Bienstman P, Fischer I. Photonic delay systems as machine learning implementations. JOURNAL OF MACHINE LEARNING RESEARCH. 2015;16:2081–97.
- IEEE
- [1] M. Hermans, M. C. Soriano, J. Dambre, P. Bienstman, and I. Fischer, “Photonic delay systems as machine learning implementations,” JOURNAL OF MACHINE LEARNING RESEARCH, vol. 16, pp. 2081–2097, 2015.
@article{7196077,
  abstract = {{Nonlinear photonic delay systems present interesting implementation platforms for machine learning models. They can be extremely fast, offer great degrees of parallelism and potentially consume far less power than digital processors. So far they have been successfully employed for signal processing using the Reservoir Computing paradigm. In this paper we show that their range of applicability can be greatly extended if we use gradient descent with backpropagation through time on a model of the system to optimize the input encoding of such systems. We perform physical experiments that demonstrate that the obtained input encodings work well in reality, and we show that optimized systems perform significantly better than the common Reservoir Computing approach. The results presented here demonstrate that common gradient descent techniques from machine learning may well be applicable on physical neuro-inspired analog computers.}},
  author   = {{Hermans, Michiel and Soriano, Miguel C. and Dambre, Joni and Bienstman, Peter and Fischer, Ingo}},
  issn     = {{1532-4435}},
  journal  = {{JOURNAL OF MACHINE LEARNING RESEARCH}},
  keywords = {{recurrent neural networks,STATES,optical computing,machine learning models}},
  language = {{eng}},
  pages    = {{2081--2097}},
  title    = {{Photonic delay systems as machine learning implementations}},
  volume   = {{16}},
  year     = {{2015}},
}