ShadowFormer: Global Context Helps Shadow Removal

Abstract

Recent deep learning methods have achieved promising results in image shadow removal. However, most of the existing approaches focus on working locally within shadow and non-shadow regions, resulting in severe artifacts around the shadow boundaries as well as inconsistent illumination between shadow and non-shadow regions. It is still challenging for the deep shadow removal model to exploit the global contextual correlation between shadow and non-shadow regions. In this work, we first propose a Retinex-based shadow model, from which we derive a novel transformer-based network, dubbed ShadowFormer, to exploit non-shadow regions to help shadow region restoration. A multi-scale channel attention framework is employed to hierarchically capture the global information. Based on that, we propose a Shadow-Interaction Module (SIM) with Shadow-Interaction Attention (SIA) in the bottleneck stage to effectively model the context correlation between shadow and non-shadow regions. We conduct extensive experiments on three popular public datasets, including ISTD, ISTD+, and SRD, to evaluate the proposed method. Our method achieves state-of-the-art performance while using up to 150× fewer model parameters.
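To make the core idea concrete, here is a minimal conceptual sketch of mask-guided self-attention in the spirit of the Shadow-Interaction Attention described above: attention scores between shadow and non-shadow tokens are up-weighted so that shadow regions borrow information from lit regions. The specific re-weighting rule, single-head formulation, and use of the input as queries, keys, and values are simplifying assumptions for illustration, not the authors' exact implementation.

```python
import numpy as np

def shadow_interaction_attention(x, shadow_mask):
    """Illustrative mask-guided self-attention (not the paper's exact SIA).

    x           : (n_tokens, d) token features
    shadow_mask : (n_tokens,) binary mask, 1 = shadow token, 0 = non-shadow
    Returns     : (n_tokens, d) re-aggregated token features
    """
    n, d = x.shape
    # Plain self-attention scores; Q = K = V = x for simplicity.
    scores = (x @ x.T) / np.sqrt(d)
    # Interaction re-weighting (assumed form): double the score of
    # shadow -> non-shadow pairs so shadow tokens preferentially attend
    # to lit regions; all other pairs keep weight 1.
    s = shadow_mask.astype(np.float64).reshape(-1, 1)
    interaction = 1.0 + s * (1.0 - s.T)
    scores = scores * interaction
    # Numerically stable softmax over keys.
    scores -= scores.max(axis=1, keepdims=True)
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)
    return attn @ x

# Example: 6 tokens of dimension 8, first two marked as shadow.
rng = np.random.default_rng(0)
tokens = rng.standard_normal((6, 8))
mask = np.array([1, 1, 0, 0, 0, 0])
out = shadow_interaction_attention(tokens, mask)
```

In the actual ShadowFormer, this interaction happens on multi-scale feature maps in the bottleneck of a transformer encoder-decoder; the sketch only conveys how a shadow mask can bias the attention map toward cross-region correlation.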

Cite

Text

Guo et al. "ShadowFormer: Global Context Helps Shadow Removal." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I1.25148

Markdown

[Guo et al. "ShadowFormer: Global Context Helps Shadow Removal." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/guo2023aaai-shadowformer/) doi:10.1609/AAAI.V37I1.25148

BibTeX

@inproceedings{guo2023aaai-shadowformer,
  title     = {{ShadowFormer: Global Context Helps Shadow Removal}},
  author    = {Guo, Lanqing and Huang, Siyu and Liu, Ding and Cheng, Hao and Wen, Bihan},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2023},
  pages     = {710--718},
  doi       = {10.1609/aaai.v37i1.25148},
  url       = {https://mlanthology.org/aaai/2023/guo2023aaai-shadowformer/}
}