Balanced Learning for Domain Adaptive Semantic Segmentation
Abstract
Unsupervised domain adaptation (UDA) for semantic segmentation aims to transfer knowledge from a labeled source domain to an unlabeled target domain. Although self-training techniques are effective in UDA, they struggle to learn each class in a balanced manner because of inherent class imbalance and distribution shift, in both data and label space, between domains. To address this issue, we propose Balanced Learning for Domain Adaptation (BLDA), a novel approach that directly assesses and alleviates class bias without requiring prior knowledge about the distribution shift. First, we identify over-predicted and under-predicted classes by analyzing the distribution of predicted logits. Subsequently, we introduce a post-hoc approach that aligns the logit distributions across classes using shared anchor distributions. Because self-training also requires the network to generate unbiased pseudo-labels, we further estimate logit distributions online and incorporate logit correction terms into the loss function. Moreover, we leverage the resulting cumulative density functions as domain-shared structural knowledge to connect the source and target domains. Extensive experiments on two standard UDA semantic segmentation benchmarks demonstrate that BLDA consistently improves performance, especially for under-predicted classes, when integrated into various existing methods.
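To make the post-hoc alignment idea concrete, the following is a minimal sketch of aligning per-class logit distributions to a shared anchor. It is an illustration of the general principle only, not the paper's actual estimator: it assumes Gaussian per-class logit statistics, a standard-normal anchor, and a hypothetical `align_to_anchor` helper with simulated class bias.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3-class logits where class 2 is systematically
# under-predicted (its logit distribution is shifted down), mimicking
# the class bias the abstract describes.
logits = rng.normal(loc=[2.0, 1.5, -1.0], scale=[1.0, 1.0, 0.5],
                    size=(10_000, 3))

def align_to_anchor(logits, anchor_mean=0.0, anchor_std=1.0):
    """Standardize each class's logits by its empirical mean/std and map
    them onto a shared anchor Gaussian, so no class wins the argmax
    merely because its logit distribution is shifted."""
    mu = logits.mean(axis=0, keepdims=True)            # per-class mean
    sigma = logits.std(axis=0, keepdims=True) + 1e-8   # per-class std
    return (logits - mu) / sigma * anchor_std + anchor_mean

before = np.bincount(logits.argmax(axis=1), minlength=3)
after = np.bincount(align_to_anchor(logits).argmax(axis=1), minlength=3)
print("predictions per class before:", before)  # class 2 rarely predicted
print("predictions per class after: ", after)   # counts far more balanced
```

The same per-class statistics, estimated online instead of post hoc, could serve as correction terms inside the training loss, which is the role the abstract assigns to the online variant during self-training.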
Cite
Text
Li et al. "Balanced Learning for Domain Adaptive Semantic Segmentation." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Li et al. "Balanced Learning for Domain Adaptive Semantic Segmentation." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/li2025icml-balanced/)
BibTeX
@inproceedings{li2025icml-balanced,
  title = {{Balanced Learning for Domain Adaptive Semantic Segmentation}},
  author = {Li, Wangkai and Sun, Rui and Liao, Bohao and Li, Zhaoyang and Zhang, Tianzhu},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year = {2025},
  pages = {35858--35883},
  volume = {267},
  url = {https://mlanthology.org/icml/2025/li2025icml-balanced/}
}