Brant: Foundation Model for Intracranial Neural Signal
Abstract
We propose Brant, a foundation model for intracranial recordings that learns powerful representations of intracranial neural signals through pre-training, providing a large-scale, off-the-shelf model for medicine. Brant is the largest model in the field of brain signals and is pre-trained on a large corpus of intracranial data that we collected. Brant is designed to capture long-term temporal dependencies and spatial correlations in neural signals, combining information from both the time and frequency domains. As a foundation model, Brant achieves SOTA performance on various downstream tasks (i.e., neural signal forecasting, frequency-phase forecasting, imputation, and seizure detection), demonstrating generalization across a broad range of tasks. A low-resource label analysis and representation visualizations further illustrate the effectiveness of our pre-training strategy. In addition, we study the effect of model size, showing that a larger model with higher capacity leads to performance improvements on our dataset. The source code and pre-trained weights are available at: https://zju-brainnet.github.io/Brant.github.io/.
Cite
Text
Zhang et al. "Brant: Foundation Model for Intracranial Neural Signal." Neural Information Processing Systems, 2023.
Markdown
[Zhang et al. "Brant: Foundation Model for Intracranial Neural Signal." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/zhang2023neurips-brant/)
BibTeX
@inproceedings{zhang2023neurips-brant,
title = {{Brant: Foundation Model for Intracranial Neural Signal}},
author = {Zhang, Daoze and Yuan, Zhizhang and Yang, Yang and Chen, Junru and Wang, Jingjing and Li, Yafeng},
booktitle = {Neural Information Processing Systems},
year = {2023},
url = {https://mlanthology.org/neurips/2023/zhang2023neurips-brant/}
}