Learning Local Implicit Fourier Representation for Image Warping
Abstract
Image warping aims to reshape images defined on rectangular grids into arbitrary shapes. Recently, implicit neural functions have shown remarkable performance in representing images in a continuous manner. However, a standalone multi-layer perceptron suffers from learning high-frequency Fourier coefficients. In this paper, we propose a local texture estimator for image warping (LTEW) followed by an implicit neural representation to deform images into continuous shapes. Local textures estimated from a deep super-resolution (SR) backbone are multiplied by locally-varying Jacobian matrices of a coordinate transformation to predict Fourier responses of a warped image. Our LTEW-based neural function outperforms existing warping methods for asymmetric-scale SR and homography transform. Furthermore, our algorithm generalizes well to arbitrary coordinate transformations, such as homography transform with a large magnification factor and equirectangular projection (ERP) perspective transform, which are unseen during training.
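The core idea in the abstract — modulating locally estimated Fourier frequencies by the Jacobian of the coordinate transform before evaluating the responses — can be sketched as follows. This is an illustrative assumption of the mechanism, not the authors' implementation; all function and variable names here are hypothetical.

```python
import numpy as np

def warp_fourier_features(freqs, amps, phases, coords_local, jacobian):
    """Hypothetical sketch: Fourier responses at a warped local coordinate.

    freqs:        (K, 2) locally estimated frequency vectors
    amps:         (K,)   corresponding amplitudes
    phases:       (K,)   corresponding phases
    coords_local: (2,)   local coordinate offset in the input grid
    jacobian:     (2, 2) Jacobian of the coordinate transform at this point
    """
    # Frequencies transform with the Jacobian (chain rule): a warp that
    # stretches space compresses the local spectrum, and vice versa.
    warped_freqs = freqs @ jacobian                      # (K, 2)
    angle = 2 * np.pi * warped_freqs @ coords_local + phases
    return amps * np.cos(angle)                          # (K,) responses

# Toy usage: identity warp vs. a warp that stretches the x-axis by 2x
# (whose Jacobian halves the effective horizontal frequencies).
rng = np.random.default_rng(0)
freqs = rng.normal(size=(8, 2))
amps = rng.uniform(size=8)
phases = rng.uniform(0, 2 * np.pi, size=8)
x = np.array([0.3, -0.1])

identity_resp = warp_fourier_features(freqs, amps, phases, x, np.eye(2))
stretch_resp = warp_fourier_features(freqs, amps, phases, x, np.diag([0.5, 1.0]))
```

In the full method, such responses would be consumed by an MLP to predict the warped pixel value; this fragment only illustrates the Jacobian modulation of the local spectrum.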
Cite
Text
Lee et al. "Learning Local Implicit Fourier Representation for Image Warping." Proceedings of the European Conference on Computer Vision (ECCV), 2022. doi:10.1007/978-3-031-19797-0_11
Markdown
[Lee et al. "Learning Local Implicit Fourier Representation for Image Warping." Proceedings of the European Conference on Computer Vision (ECCV), 2022.](https://mlanthology.org/eccv/2022/lee2022eccv-learning/) doi:10.1007/978-3-031-19797-0_11
BibTeX
@inproceedings{lee2022eccv-learning,
title = {{Learning Local Implicit Fourier Representation for Image Warping}},
author = {Lee, Jaewon and Choi, Kwang Pyo and Jin, Kyong Hwan},
booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
year = {2022},
doi = {10.1007/978-3-031-19797-0_11},
url = {https://mlanthology.org/eccv/2022/lee2022eccv-learning/}
}