Beyond Universal Saliency: Personalized Saliency Prediction with Multi-Task CNN

Abstract

Saliency detection is a long-standing problem in computer vision. Tremendous effort has been devoted to exploring a universal saliency model across users despite their differences in gender, race, age, etc. Yet recent psychology studies suggest that saliency is highly specific rather than universal: individuals exhibit heterogeneous gaze patterns when viewing an identical scene containing multiple salient objects. In this paper, we first show that such heterogeneity is common and critical for reliable saliency prediction. Our study also produces the first database of personalized saliency maps (PSMs). We model a PSM based on the universal saliency map (USM) shared by different participants and adopt a multi-task CNN framework to estimate the discrepancy between the PSM and the USM. Comprehensive experiments demonstrate that our new PSM model and prediction scheme are effective and reliable.
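The core formulation described above, a personalized saliency map composed of a shared universal map plus a person-specific discrepancy, can be sketched as follows. This is a minimal illustration using NumPy with random placeholder discrepancies standing in for the multi-task CNN's per-participant output heads; the function name, shapes, and clipping choice are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def compose_psm(usm, discrepancy):
    """Combine a universal saliency map (USM) with a predicted
    per-person discrepancy, clipping back to the valid range [0, 1]."""
    return np.clip(usm + discrepancy, 0.0, 1.0)

rng = np.random.default_rng(0)
usm = rng.random((64, 64))  # shared universal saliency map (placeholder)

# In the paper, one CNN head per participant predicts that person's
# discrepancy from the USM; here we fake three such predictions.
n_participants = 3
discrepancies = rng.normal(0.0, 0.1, (n_participants, 64, 64))

psms = np.stack([compose_psm(usm, d) for d in discrepancies])
print(psms.shape)  # one personalized map per participant: (3, 64, 64)
```

Framing prediction as USM plus discrepancy lets all participants share most network parameters, with only the lightweight per-person heads learning individual gaze tendencies.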

Cite

Text

Xu et al. "Beyond Universal Saliency: Personalized Saliency Prediction with Multi-Task CNN." International Joint Conference on Artificial Intelligence, 2017. doi:10.24963/IJCAI.2017/543

Markdown

[Xu et al. "Beyond Universal Saliency: Personalized Saliency Prediction with Multi-Task CNN." International Joint Conference on Artificial Intelligence, 2017.](https://mlanthology.org/ijcai/2017/xu2017ijcai-beyond/) doi:10.24963/IJCAI.2017/543

BibTeX

@inproceedings{xu2017ijcai-beyond,
  title     = {{Beyond Universal Saliency: Personalized Saliency Prediction with Multi-Task CNN}},
  author    = {Xu, Yanyu and Li, Nianyi and Wu, Junru and Yu, Jingyi and Gao, Shenghua},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2017},
  pages     = {3887-3893},
  doi       = {10.24963/IJCAI.2017/543},
  url       = {https://mlanthology.org/ijcai/2017/xu2017ijcai-beyond/}
}