Semantic Regularisation for Recurrent Image Annotation

Abstract

The "CNN-RNN" design pattern is increasingly widely applied in a variety of image annotation tasks, including multi-label classification and captioning. Existing models use the weakly semantic CNN hidden layer or its transform as the image embedding that provides the interface between the CNN and RNN. This leaves the RNN overstretched with two jobs: predicting the visual concepts and modelling their correlations for generating structured annotation output. Importantly, this makes the end-to-end training of the CNN and RNN slow and ineffective due to the difficulty of back-propagating gradients through the RNN to train the CNN. We propose a simple modification to the design pattern that makes learning more effective and efficient. Specifically, we propose to use a semantically regularised embedding layer as the interface between the CNN and RNN. Regularising the interface can partially or completely decouple the learning problems, allowing each to be trained more effectively and making joint training much more efficient. Extensive experiments show that state-of-the-art performance is achieved on multi-label classification as well as image captioning.
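The idea in the abstract can be illustrated with a minimal sketch (not the authors' code): the embedding passed from the CNN to the RNN is trained with an auxiliary multi-label loss to predict the visual concepts directly, so the RNN only has to model their correlations. All names, dimensions, and the loss choice below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of a semantically regularised CNN-RNN interface.
# Dimensions and variable names are assumptions, not the paper's code.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Stand-in for a CNN output: a 2048-d visual feature for one image.
cnn_feature = rng.standard_normal(2048)

# Semantic embedding layer: maps the feature to scores over K concepts.
K = 5  # number of visual concepts (labels); toy value
W = rng.standard_normal((K, 2048)) * 0.01
b = np.zeros(K)
concept_probs = sigmoid(W @ cnn_feature + b)  # shape (K,)

# Semantic regularisation: an auxiliary multi-label (binary cross-entropy)
# loss ties this layer to ground-truth concept labels, so the CNN learns
# concept prediction without gradients having to flow through the RNN.
labels = np.array([1.0, 0.0, 1.0, 0.0, 0.0])  # toy ground truth
eps = 1e-12
bce = -np.mean(labels * np.log(concept_probs + eps)
               + (1 - labels) * np.log(1 - concept_probs + eps))

# The semantically meaningful concept_probs vector (rather than the raw
# CNN feature) is what the RNN would then consume when generating the
# structured annotation output.
print(concept_probs.shape, float(bce))
```

Because the interface carries explicit concept probabilities, the two halves of the model can be pre-trained separately against their own targets before any joint fine-tuning, which is the decoupling the abstract refers to.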

Cite

Text

Liu et al. "Semantic Regularisation for Recurrent Image Annotation." Conference on Computer Vision and Pattern Recognition, 2017. doi:10.1109/CVPR.2017.443

Markdown

[Liu et al. "Semantic Regularisation for Recurrent Image Annotation." Conference on Computer Vision and Pattern Recognition, 2017.](https://mlanthology.org/cvpr/2017/liu2017cvpr-semantic/) doi:10.1109/CVPR.2017.443

BibTeX

@inproceedings{liu2017cvpr-semantic,
  title     = {{Semantic Regularisation for Recurrent Image Annotation}},
  author    = {Liu, Feng and Xiang, Tao and Hospedales, Timothy M. and Yang, Wankou and Sun, Changyin},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2017},
  doi       = {10.1109/CVPR.2017.443},
  url       = {https://mlanthology.org/cvpr/2017/liu2017cvpr-semantic/}
}