Balancing Invariant and Specific Knowledge for Domain Generalization with Online Knowledge Distillation
Abstract
Recent research has demonstrated the effectiveness of knowledge distillation in Domain Generalization. However, existing approaches often overlook domain-specific knowledge and rely on an offline distillation strategy, limiting the effectiveness of knowledge transfer. To address these limitations, we propose Balanced Online knowLedge Distillation (BOLD). BOLD leverages a multi-domain expert teacher model, with each expert specializing in a specific source domain, enabling the student to distill both domain-invariant and domain-specific knowledge. We incorporate the Pareto optimization principle and uncertainty weighting to balance these two types of knowledge, ensuring simultaneous optimization without compromising either. Additionally, BOLD employs an online knowledge distillation strategy, allowing the teacher and student to learn concurrently. This dynamic interaction enables the teacher to adapt based on student feedback, facilitating more effective knowledge transfer. Extensive experiments on seven benchmarks demonstrate that BOLD outperforms state-of-the-art methods. Furthermore, we provide theoretical insights that highlight the importance of domain-specific knowledge and the advantages of uncertainty weighting.
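The full formulation of BOLD is given in the paper; purely as an illustrative sketch of the uncertainty-weighting idea mentioned in the abstract, the snippet below combines a domain-invariant and a domain-specific distillation loss using learnable log-variance weights (in the style of homoscedastic uncertainty weighting). The class and function names (UncertaintyWeightedKD, kd_loss) and the exact loss form are assumptions for illustration, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class UncertaintyWeightedKD(nn.Module):
    """Hypothetical sketch: balance two KD losses with learnable uncertainty weights."""

    def __init__(self, temperature: float = 4.0):
        super().__init__()
        self.temperature = temperature
        # One learnable log-variance per knowledge type.
        self.log_var_inv = nn.Parameter(torch.zeros(()))
        self.log_var_spec = nn.Parameter(torch.zeros(()))

    def kd_loss(self, student_logits, teacher_logits):
        # Standard temperature-scaled KL distillation loss.
        t = self.temperature
        p_teacher = F.softmax(teacher_logits / t, dim=-1)
        log_p_student = F.log_softmax(student_logits / t, dim=-1)
        return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * t * t

    def forward(self, student_logits, invariant_teacher_logits, expert_teacher_logits):
        # Domain-invariant knowledge, e.g. from an aggregate of the expert teachers.
        loss_inv = self.kd_loss(student_logits, invariant_teacher_logits)
        # Domain-specific knowledge from the expert of the sample's source domain.
        loss_spec = self.kd_loss(student_logits, expert_teacher_logits)
        # Uncertainty weighting: scale each loss by exp(-log_var) and add log_var
        # as a regularizer so neither term is trivially ignored.
        return (torch.exp(-self.log_var_inv) * loss_inv + self.log_var_inv
                + torch.exp(-self.log_var_spec) * loss_spec + self.log_var_spec)

In an online distillation setting, this loss would be computed inside the same training loop that updates the teacher experts, so the teacher and student parameters are optimized concurrently rather than in separate offline stages.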
Cite
Text
Zhao et al. "Balancing Invariant and Specific Knowledge for Domain Generalization with Online Knowledge Distillation." International Joint Conference on Artificial Intelligence, 2025. doi:10.24963/IJCAI.2025/272
Markdown
[Zhao et al. "Balancing Invariant and Specific Knowledge for Domain Generalization with Online Knowledge Distillation." International Joint Conference on Artificial Intelligence, 2025.](https://mlanthology.org/ijcai/2025/zhao2025ijcai-balancing/) doi:10.24963/IJCAI.2025/272
BibTeX
@inproceedings{zhao2025ijcai-balancing,
title = {{Balancing Invariant and Specific Knowledge for Domain Generalization with Online Knowledge Distillation}},
author = {Zhao, Di and Zhang, Jingfeng and Hu, Hongsheng and Fournier-Viger, Philippe and Dobbie, Gillian and Koh, Yun Sing},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2025},
pages = {2440--2448},
doi = {10.24963/IJCAI.2025/272},
url = {https://mlanthology.org/ijcai/2025/zhao2025ijcai-balancing/}
}