Mixture of Balanced Information Bottlenecks for Long-Tailed Visual Recognition

Abstract

Deep neural networks (DNNs) have achieved remarkable success in applications with large-scale, balanced data. However, real-world visual recognition data are usually long-tailed, which poses challenges for the efficient training and deployment of DNNs. The information bottleneck (IB) is an elegant approach to representation learning. In this paper, we propose a balanced information bottleneck (BIB) approach that integrates loss function re-balancing and self-distillation into the original IB network. BIB is thus able to learn a sufficient representation in which essential label-related information is fully preserved for long-tailed visual recognition. To further strengthen representation learning, we also propose a novel mixture of balanced information bottlenecks (MBIB), in which different BIBs are responsible for combining knowledge from different network layers. MBIB enables an end-to-end training strategy that optimizes representation and classification simultaneously from an information-theoretic perspective. We conduct experiments on commonly used long-tailed datasets, including CIFAR100-LT, ImageNet-LT, and iNaturalist 2018. Both BIB and MBIB achieve state-of-the-art performance for long-tailed visual recognition.
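To make these ingredients concrete, the sketch below shows one way the components named above could fit together: a variational rate term that upper-bounds I(Z; X) (the standard IB compression penalty, here with a Gaussian bottleneck), a logit-adjusted cross-entropy as one common form of loss function re-balancing, and a softened-KL self-distillation term. This is a minimal illustrative sketch in PyTorch under those assumptions; the function name, hyperparameters (`beta`, `distill_weight`, `tau`), and the choice of logit adjustment as the re-balancing scheme are hypothetical, not the authors' implementation.

```python
# Illustrative sketch only: a balanced-IB-style loss under assumed components
# (variational IB + logit adjustment + self-distillation). Not the paper's code.
import torch
import torch.nn.functional as F


def balanced_ib_loss(mu, logvar, logits, teacher_logits, targets,
                     class_counts, beta=1e-3, distill_weight=0.5, tau=2.0):
    """mu, logvar: Gaussian bottleneck q(z|x) parameters, shape (B, D).
    logits: classifier outputs on the sampled bottleneck z, shape (B, C).
    teacher_logits: outputs of another branch/model for self-distillation.
    class_counts: per-class sample counts of the long-tailed training set."""
    # Rate term: KL(q(z|x) || N(0, I)) upper-bounds the compression term I(Z; X).
    rate = 0.5 * (mu.pow(2) + logvar.exp() - logvar - 1).sum(dim=1).mean()

    # Re-balanced classification: shift logits by log class priors so that
    # head classes do not dominate the cross-entropy (logit adjustment).
    prior = class_counts.float() / class_counts.sum()
    cls_loss = F.cross_entropy(logits + prior.log().unsqueeze(0), targets)

    # Self-distillation: KL between temperature-softened student and teacher.
    distill = F.kl_div(
        F.log_softmax(logits / tau, dim=1),
        F.softmax(teacher_logits.detach() / tau, dim=1),
        reduction="batchmean",
    ) * tau * tau

    return cls_loss + beta * rate + distill_weight * distill


if __name__ == "__main__":
    B, D, C = 8, 64, 10  # batch size, bottleneck width, number of classes
    loss = balanced_ib_loss(
        mu=torch.randn(B, D), logvar=torch.randn(B, D),
        logits=torch.randn(B, C), teacher_logits=torch.randn(B, C),
        targets=torch.randint(0, C, (B,)),
        class_counts=torch.tensor([500, 200, 100, 50, 30, 20, 10, 5, 3, 2]),
    )
    print(loss.item())
```

Under the MBIB structure described in the abstract, several such bottlenecks, each attached to a different network layer, would contribute their own loss terms; this sketch covers a single BIB only.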

Cite

Text

Lan et al. "Mixture of Balanced Information Bottlenecks for Long-Tailed Visual Recognition." Transactions on Machine Learning Research, 2025.

Markdown

[Lan et al. "Mixture of Balanced Information Bottlenecks for Long-Tailed Visual Recognition." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/lan2025tmlr-mixture/)

BibTeX

@article{lan2025tmlr-mixture,
  title     = {{Mixture of Balanced Information Bottlenecks for Long-Tailed Visual Recognition}},
  author    = {Lan, Yifan and Cai, Xin and Cheng, Jun and Tan, Shan},
  journal   = {Transactions on Machine Learning Research},
  year      = {2025},
  url       = {https://mlanthology.org/tmlr/2025/lan2025tmlr-mixture/}
}