AMST: Alternating Multimodal Skip Training

Abstract

Multimodal learning is a field of machine learning in which models combine multiple modalities to improve learning outcomes. However, modalities often differ in data representation and complexity, which can lead to learning imbalances during training. The time a modality takes to converge is a key indicator of such imbalance: because convergence rates differ, modalities trained simultaneously, as is common in multimodal settings, can harmfully interfere with one another's learning. To mitigate this negative impact, we propose Alternating Multimodal Skip Training (AMST), which adjusts the training frequency of each modality. This novel method not only improves performance in conventional multimodal models that learn from fused modalities but also enhances alternating models that train each modality separately. Additionally, it outperforms state-of-the-art models while reducing training time.
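The core idea of per-modality training frequencies can be illustrated with a minimal schedule sketch. This is an assumed interface for illustration only, not the paper's actual algorithm: each modality is assigned an update period, and a modality with period k is trained only every k-th epoch, so faster-converging modalities can be skipped while slower ones keep training.

```python
def skip_schedule(frequencies, num_epochs):
    """Return, for each epoch, the list of modalities to update.

    frequencies: dict mapping modality name -> update period (>= 1);
    a modality with period k is trained only when epoch % k == 0.
    (Hypothetical helper; names and parameters are illustrative.)
    """
    schedule = []
    for epoch in range(num_epochs):
        active = [m for m, k in frequencies.items() if epoch % k == 0]
        schedule.append(active)
    return schedule

# Example: suppose audio converges faster than video, so it is
# updated only every other epoch while video trains every epoch.
sched = skip_schedule({"audio": 2, "video": 1}, num_epochs=4)
# epoch 0: both; epoch 1: video only; epoch 2: both; epoch 3: video only
```

In an actual training loop, the per-epoch list would gate which modality-specific encoders receive gradient updates.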

Cite

Text

Silva et al. "AMST: Alternating Multimodal Skip Training." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2025. doi:10.1007/978-3-032-06078-5_30

Markdown

[Silva et al. "AMST: Alternating Multimodal Skip Training." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2025.](https://mlanthology.org/ecmlpkdd/2025/silva2025ecmlpkdd-amst/) doi:10.1007/978-3-032-06078-5_30

BibTeX

@inproceedings{silva2025ecmlpkdd-amst,
  title     = {{AMST: Alternating Multimodal Skip Training}},
  author    = {Silva, Hugo Manuel Alves Henriques e and Chen, Hongguang and Selpi},
  booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
  year      = {2025},
  pages     = {526--541},
  doi       = {10.1007/978-3-032-06078-5_30},
  url       = {https://mlanthology.org/ecmlpkdd/2025/silva2025ecmlpkdd-amst/}
}