Latent-Guided Exemplar-Based Image Re-Colorization
Abstract
Exemplar-based re-colorization transfers colors from a reference image to a colored or grayscale source image while accounting for the semantic correspondences between the two. Existing grayscale colorization methods usually predict only the chrominance while preserving the source's luminance; consequently, the result's colors may diverge from the reference because of this luminance difference. On the other hand, global photorealistic stylization without segmentation cannot handle scenarios where different parts of the scene require different colors. To overcome these issues, we propose a novel and effective method for re-colorization: 1) We first exploit the spatially adaptive latent space of SpaceEdit in the context of the re-colorization task and achieve re-colorization by predicting latent maps through a proposed network. 2) We then delve into SpaceEdit's self-reconstruction latent codes and maps to better characterize global style and local color properties, based on which we construct a novel loss to supervise re-colorization. Qualitative and quantitative results show that our method outperforms previous works, generating superior outputs whose colors and global styles are more consistent with the references.
Cite
Text
Yang et al. "Latent-Guided Exemplar-Based Image Re-Colorization." Winter Conference on Applications of Computer Vision, 2024.
Markdown
[Yang et al. "Latent-Guided Exemplar-Based Image Re-Colorization." Winter Conference on Applications of Computer Vision, 2024.](https://mlanthology.org/wacv/2024/yang2024wacv-latentguided/)
BibTeX
@inproceedings{yang2024wacv-latentguided,
title = {{Latent-Guided Exemplar-Based Image Re-Colorization}},
author = {Yang, Wenjie and Xu, Ning and Fan, Yifei},
booktitle = {Winter Conference on Applications of Computer Vision},
year = {2024},
pages = {4250--4259},
url = {https://mlanthology.org/wacv/2024/yang2024wacv-latentguided/}
}