Cascading Convolutional Color Constancy

Abstract

Regressing the illumination of a scene from representations of object appearances is widely adopted in computational color constancy. However, it remains challenging due to intrinsic appearance and label ambiguities caused by unknown illuminants, diverse reflection properties of materials, and extrinsic imaging factors (such as different camera sensors). In this paper, we introduce a novel algorithm, Cascading Convolutional Color Constancy (C4 for short), to improve the robustness of regression learning and achieve stable generalization across datasets (different cameras and scenes) within a unified framework. The proposed C4 method ensembles a series of dependent illumination hypotheses from each cascade stage by introducing a weighted multiply-accumulate loss function, which can inherently capture different modes of illumination and explicitly enforce coarse-to-fine network optimization. Experimental results on the public Color Checker and NUS 8-Camera benchmarks demonstrate the superior performance of the proposed algorithm over state-of-the-art methods, especially for more difficult scenes.
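The abstract describes a cascade in which each stage refines the illumination estimate of the previous one and the per-stage hypotheses are combined multiplicatively, with a weighted loss accumulated over stages. The sketch below illustrates that idea only at a schematic level; the stage backbones, stage weights, and the angular-error loss are illustrative placeholders rather than the authors' exact architecture or training setup.

```python
# Minimal sketch of the cascading, multiply-accumulate idea from the abstract.
# Assumptions: each stage regresses an RGB illuminant from the image corrected
# by the accumulated estimate so far; losses from all stages are summed with
# stage-specific weights. All names here are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


def angular_error(pred, gt, eps=1e-7):
    """Angle (radians) between predicted and ground-truth RGB illuminants."""
    cos = F.cosine_similarity(pred, gt, dim=-1).clamp(-1 + eps, 1 - eps)
    return torch.acos(cos)


class CascadeColorConstancy(nn.Module):
    def __init__(self, backbone_fn, num_stages=3):
        super().__init__()
        # One illumination-regression backbone per cascade stage (placeholder).
        self.stages = nn.ModuleList([backbone_fn() for _ in range(num_stages)])

    def forward(self, img, gt_illum=None, stage_weights=(0.25, 0.5, 1.0)):
        b = img.size(0)
        accum = img.new_ones(b, 3)        # accumulated illumination estimate
        total_loss = img.new_zeros(())
        x = img
        for k, stage in enumerate(self.stages):
            illum_k = F.normalize(stage(x), dim=-1)       # per-stage hypothesis
            accum = F.normalize(accum * illum_k, dim=-1)  # multiply-accumulate
            # Correct the input with the current estimate before the next
            # stage sees it (coarse-to-fine refinement).
            x = img / accum.clamp(min=1e-4).view(b, 3, 1, 1)
            if gt_illum is not None:
                w = stage_weights[k] if k < len(stage_weights) else 1.0
                total_loss = total_loss + w * angular_error(accum, gt_illum).mean()
        return accum, total_loss
```

The weighted sum over stage-wise angular errors mirrors the "weighted multiply-accumulate loss" mentioned in the abstract; the specific weights and number of stages here are guesses for illustration.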

Cite

Text

Yu et al. "Cascading Convolutional Color Constancy." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I07.6966

Markdown

[Yu et al. "Cascading Convolutional Color Constancy." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/yu2020aaai-cascading/) doi:10.1609/AAAI.V34I07.6966

BibTeX

@inproceedings{yu2020aaai-cascading,
  title     = {{Cascading Convolutional Color Constancy}},
  author    = {Yu, Huanglin and Chen, Ke and Wang, Kaiqi and Qian, Yanlin and Zhang, Zhaoxiang and Jia, Kui},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2020},
  pages     = {12725--12732},
  doi       = {10.1609/AAAI.V34I07.6966},
  url       = {https://mlanthology.org/aaai/2020/yu2020aaai-cascading/}
}