
Integrating dynamic stopping, transfer learning and language models in an adaptive zero-training ERP speller

Abstract
Objective. Most BCIs have to undergo a calibration session in which data is recorded to train decoders with machine learning. Only recently have zero-training methods become a subject of study. This work proposes a probabilistic framework for BCI applications which exploit event-related potentials (ERPs). For the example of a visual P300 speller, we show how the framework harvests the structure suitable to solve the decoding task by (a) transfer learning, (b) unsupervised adaptation, (c) language model and (d) dynamic stopping. Approach. A simulation study compares the proposed probabilistic zero-training framework (using transfer learning and task structure) to a state-of-the-art supervised model on n = 22 subjects. The individual influence of the involved components (a)–(d) is investigated. Main results. Without any need for a calibration session, the probabilistic zero-training framework with inter-subject transfer learning shows excellent performance—competitive to a state-of-the-art supervised method using calibration. Its decoding quality is carried mainly by the effect of transfer learning in combination with continuous unsupervised adaptation. Significance. A high-performing zero-training BCI is within reach for one of the most popular BCI paradigms: ERP spelling. Recording calibration data for a supervised BCI would require valuable time which is lost for spelling. The time spent on calibration would allow a novel user to spell 29 symbols with our unsupervised approach. It could be of use for various clinical and non-clinical ERP-applications of BCI.
Keywords
unsupervised learning, subject-to-subject transfer, BCI, ALGORITHM, BRAIN-COMPUTER-INTERFACE, visual event-related potentials, P300, speller matrix

Downloads

  • JNE ik.pdf — full text | open access | PDF | 488.65 KB

Citation

Please use this URL to cite or link to this publication:

MLA
Kindermans, Pieter-Jan, et al. “Integrating Dynamic Stopping, Transfer Learning and Language Models in an Adaptive Zero-Training ERP Speller.” JOURNAL OF NEURAL ENGINEERING, vol. 11, no. 3, 2014, doi:10.1088/1741-2560/11/3/035005.
APA
Kindermans, P.-J., Tangermann, M., Müller, K.-R., & Schrauwen, B. (2014). Integrating dynamic stopping, transfer learning and language models in an adaptive zero-training ERP speller. JOURNAL OF NEURAL ENGINEERING, 11(3). https://doi.org/10.1088/1741-2560/11/3/035005
Chicago author-date
Kindermans, Pieter-Jan, Michael Tangermann, Klaus-Robert Müller, and Benjamin Schrauwen. 2014. “Integrating Dynamic Stopping, Transfer Learning and Language Models in an Adaptive Zero-Training ERP Speller.” JOURNAL OF NEURAL ENGINEERING 11 (3). https://doi.org/10.1088/1741-2560/11/3/035005.
Chicago author-date (all authors)
Kindermans, Pieter-Jan, Michael Tangermann, Klaus-Robert Müller, and Benjamin Schrauwen. 2014. “Integrating Dynamic Stopping, Transfer Learning and Language Models in an Adaptive Zero-Training ERP Speller.” JOURNAL OF NEURAL ENGINEERING 11 (3). doi:10.1088/1741-2560/11/3/035005.
Vancouver
1. Kindermans P-J, Tangermann M, Müller K-R, Schrauwen B. Integrating dynamic stopping, transfer learning and language models in an adaptive zero-training ERP speller. JOURNAL OF NEURAL ENGINEERING. 2014;11(3).
IEEE
[1] P.-J. Kindermans, M. Tangermann, K.-R. Müller, and B. Schrauwen, “Integrating dynamic stopping, transfer learning and language models in an adaptive zero-training ERP speller,” JOURNAL OF NEURAL ENGINEERING, vol. 11, no. 3, 2014.
@article{4425479,
  abstract     = {{Objective. Most BCIs have to undergo a calibration session in which data is recorded to train decoders with machine learning. Only recently zero-training methods have become a subject of study. This work proposes a probabilistic framework for BCI applications which exploit event-related potentials (ERPs). For the example of a visual P300 speller we show how the framework harvests the structure suitable to solve the decoding task by (a) transfer learning, (b) unsupervised adaptation, (c) language model and (d) dynamic stopping. Approach. A simulation study compares the proposed probabilistic zero framework (using transfer learning and task structure) to a state-of-the-art supervised model on n = 22 subjects. The individual influence of the involved components (a)–(d) are investigated. Main results. Without any need for a calibration session, the probabilistic zero-training framework with inter-subject transfer learning shows excellent performance—competitive to a state-of-the-art supervised method using calibration. Its decoding quality is carried mainly by the effect of transfer learning in combination with continuous unsupervised adaptation. Significance. A high-performing zero-training BCI is within reach for one of the most popular BCI paradigms: ERP spelling. Recording calibration data for a supervised BCI would require valuable time which is lost for spelling. The time spent on calibration would allow a novel user to spell 29 symbols with our unsupervised approach. It could be of use for various clinical and non-clinical ERP-applications of BCI.}},
  articleno    = {{035005}},
  author       = {{Kindermans, Pieter-Jan and Tangermann, Michael and Müller, Klaus-Robert and Schrauwen, Benjamin}},
  issn         = {{1741-2560}},
  journal      = {{JOURNAL OF NEURAL ENGINEERING}},
  keywords     = {{unsupervised learning,subject-to-subject transfer,BCI,ALGORITHM,BRAIN-COMPUTER-INTERFACE,visual event-related potentials,P300,speller matrix}},
  language     = {{eng}},
  number       = {{3}},
  pages        = {{10}},
  title        = {{Integrating dynamic stopping, transfer learning and language models in an adaptive zero-training ERP speller}},
  url          = {{https://doi.org/10.1088/1741-2560/11/3/035005}},
  volume       = {{11}},
  year         = {{2014}},
}
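To cite this publication from a LaTeX document, a minimal sketch might look as follows (assuming the BibTeX entry above is saved in a file named `refs.bib` — a hypothetical filename — and the entry's record key `4425479` is used as the citation key):

```latex
\documentclass{article}
\begin{document}
% Cite the speller paper via the record key of the @article entry
As shown by Kindermans et al.~\cite{4425479}, zero-training ERP
spelling is competitive with calibrated supervised decoding.
\bibliographystyle{plain}
\bibliography{refs} % refs.bib holds the @article{4425479, ...} entry
\end{document}
```

Any BibTeX-compatible style (plain, ieeetr, a journal's own .bst) can consume the entry unchanged.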
