Boosted Zero-Shot Learning with Semantic Correlation Regularization
Abstract
We study zero-shot learning (ZSL) as a transfer learning problem, and focus on two key aspects of ZSL: model effectiveness and model adaptation. For effective modeling, we adopt the boosting strategy to learn a strong zero-shot classifier from weak models. For adaptable knowledge transfer, we devise a Semantic Correlation Regularization (SCR) approach that regularizes the boosted model to be consistent with the inter-class semantic correlations. With SCR embedded in the boosting objective, and with self-controlled sample selection for learning robustness, we propose a unified framework, Boosted Zero-shot classification with Semantic Correlation Regularization (BZ-SCR). By balancing SCR-regularized boosted model selection against self-controlled sample selection, BZ-SCR captures discriminative and adaptable feature-to-class semantic alignments while ensuring the reliability and adaptability of the selected samples. Experiments on two ZSL datasets show the superiority of BZ-SCR over state-of-the-art methods.
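To make the three ingredients named in the abstract concrete, the sketch below is a minimal, hypothetical NumPy illustration: weak feature-to-attribute alignments combined by a boosting loop, an SCR-style term that smooths class scores using the inter-class semantic correlations, and a self-paced selection step that keeps only the currently reliable samples. Every function name and hyper-parameter here is assumed for illustration and is not the authors' implementation.

# Illustrative sketch only (not the paper's code): boosting + SCR-style
# smoothing + self-paced sample selection, with assumed hyper-parameters.
import numpy as np

def semantic_correlation(S):
    """Cosine similarity between class attribute vectors (rows of S)."""
    S_norm = S / np.linalg.norm(S, axis=1, keepdims=True)
    return S_norm @ S_norm.T

def fit_weak_learner(X, S, y, weights, lam=1.0):
    """One weak feature-to-attribute alignment: weighted ridge regression."""
    W = np.sqrt(weights)[:, None]
    A = np.linalg.solve((W * X).T @ (W * X) + lam * np.eye(X.shape[1]),
                        (W * X).T @ (W * S[y]))
    return A  # maps image features to the class attribute space

def boosted_zsl_sketch(X, y, S, rounds=5, keep_ratio=0.8, gamma=0.1):
    """Toy boosting loop with SCR-style smoothing and self-paced selection."""
    n = X.shape[0]
    C = semantic_correlation(S)        # inter-class semantic correlations
    weights = np.ones(n) / n
    ensemble = []
    for _ in range(rounds):
        A = fit_weak_learner(X, S, y, weights)
        scores = X @ A @ S.T           # compatibility with every class
        # SCR-style term: pull scores toward the class-correlation structure.
        scores = gamma * (scores @ C) + (1 - gamma) * scores
        losses = scores.max(axis=1) - scores[np.arange(n), y]
        # Self-paced selection: keep only the currently "easy" (low-loss) samples,
        # then reweight the kept samples by their loss for the next weak learner.
        threshold = np.quantile(losses, keep_ratio)
        weights = np.where(losses <= threshold, losses + 1e-3, 0.0)
        weights /= weights.sum()
        ensemble.append(A)
    return ensemble

def predict(ensemble, X, S_unseen):
    """Average the weak alignments, then score unseen classes by attributes."""
    scores = sum(X @ A @ S_unseen.T for A in ensemble) / len(ensemble)
    return scores.argmax(axis=1)

At test time only the attribute matrix of the unseen classes is needed, which is what makes the averaged alignments usable for zero-shot prediction in this toy setup.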
Cite
Text
Pi et al. "Boosted Zero-Shot Learning with Semantic Correlation Regularization." International Joint Conference on Artificial Intelligence, 2017. doi:10.24963/IJCAI.2017/362
Markdown
[Pi et al. "Boosted Zero-Shot Learning with Semantic Correlation Regularization." International Joint Conference on Artificial Intelligence, 2017.](https://mlanthology.org/ijcai/2017/pi2017ijcai-boosted/) doi:10.24963/IJCAI.2017/362
BibTeX
@inproceedings{pi2017ijcai-boosted,
title = {{Boosted Zero-Shot Learning with Semantic Correlation Regularization}},
author = {Pi, Te and Li, Xi and Zhang, Zhongfei (Mark)},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2017},
  pages = {2599--2605},
doi = {10.24963/IJCAI.2017/362},
url = {https://mlanthology.org/ijcai/2017/pi2017ijcai-boosted/}
}