A Strong Baseline for Molecular Few-Shot Learning

Abstract

Few-shot learning has recently attracted significant interest in drug discovery, with a fast-growing literature that mostly involves convoluted meta-learning strategies. We revisit the more straightforward fine-tuning approach for molecular data and propose a regularized quadratic-probe loss based on the Mahalanobis distance. We design a dedicated block-coordinate descent optimizer, which avoids degenerate solutions of our loss. Interestingly, our simple fine-tuning approach achieves highly competitive performance compared to state-of-the-art methods, while being applicable to black-box settings and removing the need for specific episodic pre-training strategies. Furthermore, we introduce a new benchmark to assess the robustness of the competing methods to domain shifts. In this setting, our fine-tuning baseline obtains consistently better results than meta-learning methods.
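To make the idea concrete, below is a minimal sketch (not the authors' code) of a Mahalanobis-style quadratic probe fitted on frozen molecular embeddings with alternating block updates. The class names (QuadraticProbe, fit_probe), the identity-metric regularizer, and the use of Adam for each block are illustrative assumptions, not the paper's exact loss or its dedicated block-coordinate optimizer.

```python
# Sketch: a quadratic probe scoring an embedding x against each class c with a
# Mahalanobis-style distance -(x - mu_c)^T M (x - mu_c), where M = L L^T is a
# shared positive semi-definite matrix learned on the few-shot support set.
import torch
import torch.nn.functional as F

class QuadraticProbe(torch.nn.Module):
    def __init__(self, dim: int, n_classes: int):
        super().__init__()
        # Class centroids (in practice they could be initialized from support-set means).
        self.mu = torch.nn.Parameter(torch.zeros(n_classes, dim))
        # M = L L^T is positive semi-definite by construction.
        self.L = torch.nn.Parameter(torch.eye(dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, dim) frozen molecular embeddings from any (possibly black-box) encoder.
        diff = x.unsqueeze(1) - self.mu.unsqueeze(0)   # (batch, classes, dim)
        proj = diff @ self.L                           # Mahalanobis factor
        return -(proj ** 2).sum(dim=-1)                # negative squared distance as logit

def fit_probe(probe, x_support, y_support, steps=200, lam=1e-2):
    """Block-coordinate-style fitting: alternate updates of the centroids and the metric."""
    opt_mu = torch.optim.Adam([probe.mu], lr=1e-1)
    opt_L = torch.optim.Adam([probe.L], lr=1e-2)
    for step in range(steps):
        opt = opt_mu if step % 2 == 0 else opt_L       # update one block per step
        opt.zero_grad()
        loss = F.cross_entropy(probe(x_support), y_support)
        # Assumed regularizer: keep the learned metric close to the identity.
        eye = torch.eye(probe.L.shape[0])
        loss = loss + lam * (probe.L @ probe.L.T - eye).pow(2).sum()
        loss.backward()
        opt.step()
    return probe

# Usage: binary activity prediction from 16 support molecules with 128-d embeddings.
probe = fit_probe(QuadraticProbe(dim=128, n_classes=2),
                  torch.randn(16, 128), torch.randint(0, 2, (16,)))
```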

Cite

Text

Formont et al. "A Strong Baseline for Molecular Few-Shot Learning." Transactions on Machine Learning Research, 2025.

Markdown

[Formont et al. "A Strong Baseline for Molecular Few-Shot Learning." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/formont2025tmlr-strong/)

BibTeX

@article{formont2025tmlr-strong,
  title     = {{A Strong Baseline for Molecular Few-Shot Learning}},
  author    = {Formont, Philippe and Jeannin, Hugo and Piantanida, Pablo and Ben Ayed, Ismail},
  journal   = {Transactions on Machine Learning Research},
  year      = {2025},
  url       = {https://mlanthology.org/tmlr/2025/formont2025tmlr-strong/}
}