Generative Models for Multi-Illumination Color Constancy
Abstract
This paper addresses multi-illumination color constancy. Most existing color constancy methods are designed for single light sources, and datasets for learning multi-illumination color constancy are largely missing. We propose a seed-based (physics-driven) multi-illumination color constancy method. GANs are exploited to model illumination estimation as an image-to-image domain translation problem. Additionally, a novel multi-illumination data augmentation method is proposed. Experiments on single- and multi-illumination datasets show that our methods outperform state-of-the-art methods.
Cite
Text
Das et al. "Generative Models for Multi-Illumination Color Constancy." IEEE/CVF International Conference on Computer Vision Workshops, 2021. doi:10.1109/ICCVW54120.2021.00139
Markdown
[Das et al. "Generative Models for Multi-Illumination Color Constancy." IEEE/CVF International Conference on Computer Vision Workshops, 2021.](https://mlanthology.org/iccvw/2021/das2021iccvw-generative/) doi:10.1109/ICCVW54120.2021.00139
BibTeX
@inproceedings{das2021iccvw-generative,
title = {{Generative Models for Multi-Illumination Color Constancy}},
author = {Das, Partha and Liu, Yang and Karaoglu, Sezer and Gevers, Theo},
booktitle = {IEEE/CVF International Conference on Computer Vision Workshops},
year = {2021},
pages = {1194--1203},
doi = {10.1109/ICCVW54120.2021.00139},
url = {https://mlanthology.org/iccvw/2021/das2021iccvw-generative/}
}