Bandit Algorithms Boost Brain Computer Interfaces for Motor-Task Selection of a Brain-Controlled Button

Abstract

A brain-computer interface (BCI) allows users to “communicate” with a computer without using their muscles. BCIs based on sensorimotor rhythms use imagined motor tasks, such as moving the right or left hand, to send control signals. The performance of a BCI can vary greatly across users and also depends on the tasks used, making appropriate task selection an important issue. This study presents a new procedure to automatically select, as fast as possible, a discriminant motor task for a brain-controlled button. For this purpose, we develop an adaptive algorithm, UCB-classif, based on stochastic bandit theory. It shortens the training stage, thereby allowing the exploration of a greater variety of tasks. By not wasting time on inefficient tasks and focusing on the most promising ones, the algorithm selects a task faster and makes more efficient use of the BCI training session. Compared to the standard practice in task selection, for a fixed time budget UCB-classif leads to an improved classification rate, and for a fixed classification rate it reduces the time spent in training by 50%.
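
The sketch below illustrates, in Python, the upper-confidence-bound selection rule that motivates UCB-classif; it is not the authors' implementation. The names estimate_accuracy, n_init and alpha are illustrative assumptions: estimate_accuracy stands in for whatever routine collects one block of EEG trials for an imagined movement and returns a classification-rate estimate.

import numpy as np

def ucb_classif_sketch(tasks, estimate_accuracy, budget, n_init=1, alpha=2.0):
    """Bandit-style selection of the most discriminant motor task (illustrative sketch).

    tasks             -- candidate imagined movements, e.g. ["left hand", "right hand", "feet"]
    estimate_accuracy -- hypothetical callable: records one block of EEG trials for a task
                         and returns a classification-rate estimate in [0, 1]
    budget            -- total number of sampling blocks available during training
    """
    K = len(tasks)
    counts = np.zeros(K)   # blocks sampled per task
    means = np.zeros(K)    # running mean of the accuracy estimates

    def sample(k):
        acc = estimate_accuracy(tasks[k])
        counts[k] += 1
        means[k] += (acc - means[k]) / counts[k]   # incremental mean update

    # Initialisation: sample every task at least n_init times
    for k in range(K):
        for _ in range(n_init):
            sample(k)

    # Adaptive phase: spend the remaining budget on the most promising tasks
    for t in range(K * n_init, budget):
        bonus = np.sqrt(alpha * np.log(t + 1) / counts)   # exploration bonus
        sample(int(np.argmax(means + bonus)))             # upper-confidence-bound rule

    best = int(np.argmax(means))
    return tasks[best], means[best]

In a real session, estimate_accuracy might, for example, train a classifier on the trials recorded for that task versus a rest condition and return a cross-validated accuracy; the bandit loop then concentrates the remaining recording time on the tasks whose confidence bounds stay highest.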

Cite

Text

Fruitet et al. "Bandit Algorithms Boost Brain Computer Interfaces for Motor-Task Selection of a Brain-Controlled Button." Neural Information Processing Systems, 2012.

Markdown

[Fruitet et al. "Bandit Algorithms Boost Brain Computer Interfaces for Motor-Task Selection of a Brain-Controlled Button." Neural Information Processing Systems, 2012.](https://mlanthology.org/neurips/2012/fruitet2012neurips-bandit/)

BibTeX

@inproceedings{fruitet2012neurips-bandit,
  title     = {{Bandit Algorithms Boost Brain Computer Interfaces for Motor-Task Selection of a Brain-Controlled Button}},
  author    = {Fruitet, Joan and Carpentier, Alexandra and Clerc, Maureen and Munos, Rémi},
  booktitle = {Neural Information Processing Systems},
  year      = {2012},
  pages     = {449-457},
  url       = {https://mlanthology.org/neurips/2012/fruitet2012neurips-bandit/}
}