User-Centric Affective Computing of Image Emotion Perceptions
Abstract
We propose to predict the personalized emotion perceptions of images for each viewer. Different factors that may influence emotion perception, including visual content, social context, temporal evolution, and location influence, are jointly investigated via the presented rolling multi-task hypergraph learning. For evaluation, we construct a large-scale image emotion dataset from Flickr, named Image-Emotion-Social-Net, with over 1 million images and about 8,000 users. Experiments conducted on this dataset demonstrate the superiority of the proposed method compared with state-of-the-art approaches.
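As background for the abstract above, the core machinery is hypergraph learning, where each hyperedge groups images that share one influence factor (same uploader, similar visual content, etc.) and emotion labels are propagated over the hypergraph. The following is a minimal, illustrative sketch of standard transductive hypergraph label propagation (in the style of Zhou et al.'s hypergraph learning); the paper's rolling multi-task extension and its specific hyperedge construction are simplified away, and all names and the toy data are assumptions, not the authors' implementation.

```python
import numpy as np

def normalized_hypergraph_operator(H, w):
    """Build the normalized operator Theta = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}.

    H: |V| x |E| incidence matrix (vertex i belongs to hyperedge j).
    w: per-hyperedge weights.
    """
    dv = H @ w                 # vertex degrees
    de = H.sum(axis=0)         # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(dv))
    return Dv_inv_sqrt @ H @ np.diag(w / de) @ H.T @ Dv_inv_sqrt

def propagate(H, w, y, alpha=0.9, iters=100):
    """Iterate f <- alpha * Theta @ f + (1 - alpha) * y until (near) convergence."""
    Theta = normalized_hypergraph_operator(H, w)
    f = y.copy()
    for _ in range(iters):
        f = alpha * Theta @ f + (1 - alpha) * y
    return f

# Toy example: 4 images, 2 hyperedges (e.g. "same user", "similar content").
H = np.array([[1, 0],
              [1, 0],
              [0, 1],
              [0, 1]], dtype=float)
w = np.array([1.0, 1.0])
# Seed labels: image 0 perceived as positive, image 2 as negative;
# images 1 and 3 are unlabeled and inherit scores from their hyperedges.
y = np.array([1.0, 0.0, -1.0, 0.0])
scores = propagate(H, w, y)
print(scores)
```

In this toy run the unlabeled images end up with the sign of the seed they share a hyperedge with, which is the behavior the multi-factor hyperedges in the paper exploit at scale.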
Cite
Text
Zhao et al. "User-Centric Affective Computing of Image Emotion Perceptions." AAAI Conference on Artificial Intelligence, 2016. doi:10.1609/AAAI.V30I1.9947
Markdown
[Zhao et al. "User-Centric Affective Computing of Image Emotion Perceptions." AAAI Conference on Artificial Intelligence, 2016.](https://mlanthology.org/aaai/2016/zhao2016aaai-user/) doi:10.1609/AAAI.V30I1.9947
BibTeX
@inproceedings{zhao2016aaai-user,
title = {{User-Centric Affective Computing of Image Emotion Perceptions}},
author = {Zhao, Sicheng and Yao, Hongxun and Xie, Wenlong and Jiang, Xiaolei},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2016},
pages = {4284--4285},
doi = {10.1609/AAAI.V30I1.9947},
url = {https://mlanthology.org/aaai/2016/zhao2016aaai-user/}
}