Emergent self-adaptation in an integrated photonic neural network for backpropagation-free learning
- Author
- Alessio Lugnan (UGent), Samarth Aggarwal, Frank Brueckerhoff-Plueckelmann, C. David Wright, Wolfram H. P. Pernice, Harish Bhaskaran and Peter Bienstman (UGent)
- Project
- Photonic enabled Petascale in-memory computing with Femtojoule energy consumption
- NEUROmorphic energy-efficient secure accelerators based on Phase change materials aUgmented siLicon photonicS
- High-speed low-power neuromorphic photonic information processing with chaotic cavities
- Photonic Ising Machines
- Abstract
- Plastic self-adaptation, nonlinear recurrent dynamics and multi-scale memory are desired features in hardware implementations of neural networks, because they enable them to learn, adapt, and process information similarly to the way biological brains do. In this work, these properties occurring in arrays of photonic neurons are experimentally demonstrated. Importantly, this is realized autonomously in an emergent fashion, without the need for an external controller setting weights and without explicit feedback of a global reward signal. Using a hierarchy of such arrays coupled to a backpropagation-free training algorithm based on simple logistic regression, a performance of 98.2% is achieved on the MNIST task, a popular benchmark task looking at classification of written digits. The plastic nodes consist of silicon photonics microring resonators covered by a patch of phase-change material that implements nonvolatile memory. The system is compact, robust, and straightforward to scale up through the use of multiple wavelengths. Moreover, it constitutes a unique platform to test and efficiently implement biologically plausible learning schemes at a high processing speed.
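- The backpropagation-free training described above fits only a simple linear readout (logistic regression) on top of fixed, untrained nonlinear layers. The minimal sketch below illustrates that idea on synthetic toy data, with a random tanh projection standing in for the photonic arrays; all names, dimensions, and the data itself are illustrative assumptions, not the paper's actual device model or MNIST pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the photonic stage: a fixed, random nonlinear
# projection with no trainable weights (the real device dynamics
# are far richer: recurrence, plasticity, multi-scale memory).
def photonic_features(x, W):
    return np.tanh(x @ W)

# Synthetic 3-class toy data (placeholder for MNIST digits):
# each class is Gaussian noise shifted along its own dimension.
n, d, h, c = 300, 20, 64, 3
X = rng.normal(size=(n, d))
y = rng.integers(0, c, size=n)
X += 3.0 * np.eye(d)[:c][y]

W = rng.normal(size=(d, h)) / np.sqrt(d)   # fixed, never trained
F = photonic_features(X, W)

# Backpropagation-free training: only the linear softmax readout
# is fit, by plain gradient descent on the cross-entropy loss.
Wout = np.zeros((h, c))
b = np.zeros(c)
Y = np.eye(c)[y]                            # one-hot targets
for _ in range(500):
    logits = F @ Wout + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    g = (p - Y) / n                         # softmax cross-entropy gradient
    Wout -= 0.5 * F.T @ g
    b -= 0.5 * g.sum(axis=0)

acc = (np.argmax(F @ Wout + b, axis=1) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Because gradients never flow back through the fixed feature stage, the costly and hardware-unfriendly backward pass is avoided; only the shallow readout is optimized.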
- Keywords
- neuromorphic computing, machine learning, phase change materials, reservoir computing, self-adapting systems, silicon photonics, synaptic plasticity
Downloads
- pub 3077.pdf: full text (Published version) | open access | 5.92 MB
Citation
Please use this url to cite or link to this publication: http://hdl.handle.net/1854/LU-01JK8RP18DF1SPY4DG2G9NY5F5
- MLA
- Lugnan, Alessio, et al. “Emergent Self-Adaptation in an Integrated Photonic Neural Network for Backpropagation-Free Learning.” ADVANCED SCIENCE, vol. 12, no. 2, 2025, doi:10.1002/advs.202404920.
- APA
- Lugnan, A., Aggarwal, S., Brueckerhoff-Plueckelmann, F., Wright, C. D., Pernice, W. H. P., Bhaskaran, H., & Bienstman, P. (2025). Emergent self-adaptation in an integrated photonic neural network for backpropagation-free learning. ADVANCED SCIENCE, 12(2). https://doi.org/10.1002/advs.202404920
- Chicago author-date
- Lugnan, Alessio, Samarth Aggarwal, Frank Brueckerhoff-Plueckelmann, C. David Wright, Wolfram H. P. Pernice, Harish Bhaskaran, and Peter Bienstman. 2025. “Emergent Self-Adaptation in an Integrated Photonic Neural Network for Backpropagation-Free Learning.” ADVANCED SCIENCE 12 (2). https://doi.org/10.1002/advs.202404920.
- Chicago author-date (all authors)
- Lugnan, Alessio, Samarth Aggarwal, Frank Brueckerhoff-Plueckelmann, C. David Wright, Wolfram H. P. Pernice, Harish Bhaskaran, and Peter Bienstman. 2025. “Emergent Self-Adaptation in an Integrated Photonic Neural Network for Backpropagation-Free Learning.” ADVANCED SCIENCE 12 (2). doi:10.1002/advs.202404920.
- Vancouver
- 1.Lugnan A, Aggarwal S, Brueckerhoff-Plueckelmann F, Wright CD, Pernice WHP, Bhaskaran H, et al. Emergent self-adaptation in an integrated photonic neural network for backpropagation-free learning. ADVANCED SCIENCE. 2025;12(2).
- IEEE
- [1]A. Lugnan et al., “Emergent self-adaptation in an integrated photonic neural network for backpropagation-free learning,” ADVANCED SCIENCE, vol. 12, no. 2, 2025.
@article{01JK8RP18DF1SPY4DG2G9NY5F5,
abstract = {{Plastic self-adaptation, nonlinear recurrent dynamics and multi-scale memory are desired features in hardware implementations of neural networks, because they enable them to learn, adapt, and process information similarly to the way biological brains do. In this work, these properties occurring in arrays of photonic neurons are experimentally demonstrated. Importantly, this is realized autonomously in an emergent fashion, without the need for an external controller setting weights and without explicit feedback of a global reward signal. Using a hierarchy of such arrays coupled to a backpropagation-free training algorithm based on simple logistic regression, a performance of 98.2% is achieved on the MNIST task, a popular benchmark task looking at classification of written digits. The plastic nodes consist of silicon photonics microring resonators covered by a patch of phase-change material that implements nonvolatile memory. The system is compact, robust, and straightforward to scale up through the use of multiple wavelengths. Moreover, it constitutes a unique platform to test and efficiently implement biologically plausible learning schemes at a high processing speed.}},
articleno = {{2404920}},
author = {{Lugnan, Alessio and Aggarwal, Samarth and Brueckerhoff-Plueckelmann, Frank and Wright, C. David and Pernice, Wolfram H. P. and Bhaskaran, Harish and Bienstman, Peter}},
issn = {{2198-3844}},
journal = {{ADVANCED SCIENCE}},
keywords = {{neuromorphic computing,machine learning,phase change materials,reservoir computing,self-adapting systems,silicon photonics,synaptic plasticity}},
language = {{eng}},
number = {{2}},
pages = {{17}},
title = {{Emergent self-adaptation in an integrated photonic neural network for backpropagation-free learning}},
url = {{https://doi.org/10.1002/advs.202404920}},
volume = {{12}},
year = {{2025}},
}