Deep Atrous Guided Filter for Image Restoration in Under Display Cameras
Abstract
Under Display Cameras present a promising opportunity for phone manufacturers to achieve bezel-free displays by positioning the camera behind semi-transparent OLED screens. Unfortunately, such imaging systems suffer from severe image degradation due to light attenuation and diffraction effects. In this work, we present Deep Atrous Guided Filter (DAGF), a two-stage, end-to-end approach for image restoration in UDC systems. A Low-Resolution Network first restores image quality at low resolution, and its output is then used by the Guided Filter Network as a filtering input to produce a high-resolution result. Besides the initial downsampling, our low-resolution network uses multiple parallel atrous convolutions to preserve spatial resolution and emulate multi-scale processing. Our approach's ability to directly train on megapixel images results in significant performance improvement. We additionally propose a simple simulation scheme to pre-train our model and boost performance. Our overall framework ranks 2nd and 5th in the RLQ-TOD'20 UDC Challenge for POLED and TOLED displays, respectively.
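The key property of atrous (dilated) convolution used above is that, with matching padding, each branch enlarges the receptive field without shrinking the feature map, so several branches with different rates can be combined at full resolution. The following is a minimal single-channel NumPy sketch of that idea (cross-correlation form, as in deep learning frameworks); it is illustrative only and not the authors' implementation, and the function names are hypothetical.

```python
import numpy as np

def atrous_conv2d(x, kernel, rate):
    """'Same'-padded 2D atrous (dilated) filtering, single channel.
    Padding by rate * (k // 2) keeps the spatial resolution unchanged."""
    k = kernel.shape[0]
    pad = rate * (k // 2)
    xp = np.pad(x, pad)
    out = np.zeros_like(x, dtype=float)
    H, W = x.shape
    # Each kernel tap samples the input at a stride of `rate` pixels,
    # enlarging the receptive field to rate*(k-1)+1 without downsampling.
    for i in range(k):
        for j in range(k):
            out += kernel[i, j] * xp[i * rate : i * rate + H,
                                     j * rate : j * rate + W]
    return out

def parallel_atrous_block(x, kernels, rates):
    """Sum of parallel atrous branches: multi-scale context at full resolution."""
    return sum(atrous_conv2d(x, k, r) for k, r in zip(kernels, rates))
```

Because every branch returns a map of the input's size, branches with small and large dilation rates can be fused directly, which is what lets the network emulate multi-scale processing without an encoder-decoder pyramid.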
Cite
Text

Sundar et al. "Deep Atrous Guided Filter for Image Restoration in Under Display Cameras." European Conference on Computer Vision Workshops, 2020. doi:10.1007/978-3-030-68238-5_29

Markdown

[Sundar et al. "Deep Atrous Guided Filter for Image Restoration in Under Display Cameras." European Conference on Computer Vision Workshops, 2020.](https://mlanthology.org/eccvw/2020/sundar2020eccvw-deep/) doi:10.1007/978-3-030-68238-5_29

BibTeX
@inproceedings{sundar2020eccvw-deep,
title = {{Deep Atrous Guided Filter for Image Restoration in Under Display Cameras}},
author = {Sundar, Varun and Hegde, Sumanth and Kothandaraman, Divya and Mitra, Kaushik},
booktitle = {European Conference on Computer Vision Workshops},
year = {2020},
pages = {379--397},
doi = {10.1007/978-3-030-68238-5_29},
url = {https://mlanthology.org/eccvw/2020/sundar2020eccvw-deep/}
}