Color-Sensitive Person Re-Identification
Abstract
Recent deep Re-ID models mainly focus on learning high-level semantic features, while failing to explicitly explore color information, which is one of the most important cues for person Re-ID. In this paper, we propose a novel Color-Sensitive Re-ID model that takes full advantage of color information. On one hand, we train our model with both real and fake images. The extra fake images expose the model to more color variation and help avoid overfitting during training. On the other hand, we also train our model with images of the same person wearing different colors, which forces the learned features to attend to color differences in local regions. To generate fake images with specified colors, we propose a novel Color Translation GAN (CTGAN) that learns mappings between different clothing colors while preserving identity consistency within the same clothing color. Extensive evaluations on two benchmark datasets show that our approach significantly outperforms state-of-the-art Re-ID models.
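To make the training scheme described in the abstract concrete, below is a minimal sketch of one training step that mixes real images with color-translated (fake) images of the same identities. It assumes a StarGAN-style generator conditioned on a one-hot clothing-color code as a stand-in for CTGAN; the toy modules `ColorTranslator` and `ReIDNet`, and parameters such as `num_colors`, are illustrative assumptions, not the authors' implementation or losses.

```python
# Hedged sketch: a color-sensitive Re-ID training step that uses both real
# images and color-translated (fake) images sharing the same identity labels.
# The generator is a placeholder for CTGAN; its real architecture and
# adversarial losses are not specified in the abstract above.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ColorTranslator(nn.Module):
    """Toy StarGAN-style generator: (image, one-hot target color) -> image."""
    def __init__(self, num_colors):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3 + num_colors, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x, color):
        # Broadcast the one-hot color code over spatial dims and concatenate.
        c = color[:, :, None, None].expand(-1, -1, x.size(2), x.size(3))
        return self.net(torch.cat([x, c], dim=1))

class ReIDNet(nn.Module):
    """Toy Re-ID backbone: image -> identity logits."""
    def __init__(self, num_ids):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.classifier = nn.Linear(32, num_ids)

    def forward(self, x):
        return self.classifier(self.features(x))

num_ids, num_colors = 10, 6          # illustrative dataset sizes
G = ColorTranslator(num_colors)      # assumed pre-trained CTGAN stand-in
model = ReIDNet(num_ids)
opt = torch.optim.Adam(model.parameters(), lr=1e-4)

# Dummy batch: real images, identity labels, and random target clothing colors.
real = torch.rand(4, 3, 64, 32)
ids = torch.randint(0, num_ids, (4,))
target_color = F.one_hot(torch.randint(0, num_colors, (4,)), num_colors).float()

with torch.no_grad():
    fake = G(real, target_color)     # same identities, different clothing colors

# Train on both real and fake images so each identity is seen under more
# color variation; the identity labels are shared between the two batches.
loss = F.cross_entropy(model(real), ids) + F.cross_entropy(model(fake), ids)
opt.zero_grad()
loss.backward()
opt.step()
print(f"combined identity loss: {loss.item():.4f}")
```

The paper additionally contrasts images of the same person rendered in different colors; a pairwise or triplet term over `model.features(real)` and `model.features(fake)` would slot into the same step, but its exact form is not given in the abstract, so it is omitted here.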
Cite
Text
Wang et al. "Color-Sensitive Person Re-Identification." International Joint Conference on Artificial Intelligence, 2019. doi:10.24963/IJCAI.2019/131
Markdown
[Wang et al. "Color-Sensitive Person Re-Identification." International Joint Conference on Artificial Intelligence, 2019.](https://mlanthology.org/ijcai/2019/wang2019ijcai-color/) doi:10.24963/IJCAI.2019/131
BibTeX
@inproceedings{wang2019ijcai-color,
title = {{Color-Sensitive Person Re-Identification}},
author = {Wang, Guan'an and Yang, Yang and Cheng, Jian and Wang, Jinqiao and Hou, Zeng-Guang},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2019},
pages = {933--939},
doi = {10.24963/IJCAI.2019/131},
url = {https://mlanthology.org/ijcai/2019/wang2019ijcai-color/}
}