A Generative Foundation Model for Antibody Sequence Understanding
Abstract
Here we introduce FAbCon, a generative antibody-specific language model comprising 2.4 billion parameters. Conventional wisdom in developing large language models holds that increasing model scale translates to higher performance on downstream tasks. Starting from a 144-million-parameter model, we show that progressively larger models achieve greater accuracy in predicting antigen binding and can also be used to design new antibodies with good predicted developability potential.
Cite
Text
Barton et al. "A Generative Foundation Model for Antibody Sequence Understanding." ICML 2024 Workshops: AccMLBio, 2024.
Markdown
[Barton et al. "A Generative Foundation Model for Antibody Sequence Understanding." ICML 2024 Workshops: AccMLBio, 2024.](https://mlanthology.org/icmlw/2024/barton2024icmlw-generative/)
BibTeX
@inproceedings{barton2024icmlw-generative,
  title = {{A Generative Foundation Model for Antibody Sequence Understanding}},
  author = {Barton, Justin and Gaspariunas, Aretas and Yadin, David A and Dias, Jorge and Nice, Francesca L and Minns, Danielle H and Snudden, Olivia and Povall, Chelsea and Tomas, Sara Valle and Dobson, Harry and Farmery, James H R and Leem, Jinwoo and Galson, Jacob D},
  booktitle = {ICML 2024 Workshops: AccMLBio},
  year = {2024},
  url = {https://mlanthology.org/icmlw/2024/barton2024icmlw-generative/}
}