Generating Natural Images with Direct Patch Distributions Matching

Abstract

Many traditional computer vision algorithms generate realistic images by requiring that each patch in the generated image be similar to a patch in a training image and vice versa. Recently, this classical approach has been replaced by adversarial training with a patch discriminator. The adversarial approach avoids the computational burden of finding nearest neighbors of patches but often requires very long training times and may fail to match the distribution of patches. In this paper we leverage the Sliced Wasserstein Distance to develop an algorithm that explicitly and efficiently minimizes the distance between patch distributions in two images. Our method is conceptually simple, requires no training, and can be implemented in a few lines of code. On a number of image generation tasks we show that our results are often superior to those of single-image GANs and that we can generate high-quality images in a few seconds. Our implementation is publicly available at https://github.com/ariel415el/GPDM.
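To illustrate the core idea, here is a minimal single-scale sketch in PyTorch: extract all overlapping patches from both images, estimate the Sliced Wasserstein Distance between the two patch sets via random 1-D projections, and optimize the synthetic image by gradient descent on that distance. This is not the authors' implementation (see the repository above); the patch size, resolution, learning rate, and iteration count are illustrative placeholders, and the two images are assumed to yield equal patch counts.

import torch
import torch.nn.functional as F

def extract_patches(img, patch_size=7, stride=1):
    # Flatten all overlapping patches of a (1, C, H, W) image into rows.
    patches = F.unfold(img, kernel_size=patch_size, stride=stride)
    return patches.squeeze(0).T  # (n_patches, C * patch_size**2)

def swd(x, y, n_proj=64):
    # Monte-Carlo estimate of the Sliced Wasserstein Distance between
    # two equally sized point sets x, y of shape (n, d).
    proj = torch.randn(x.shape[1], n_proj, device=x.device)
    proj = proj / proj.norm(dim=0, keepdim=True)  # directions on the unit sphere
    x_sorted, _ = (x @ proj).sort(dim=0)          # sorted 1-D projections
    y_sorted, _ = (y @ proj).sort(dim=0)
    return (x_sorted - y_sorted).abs().mean()     # average 1-D Wasserstein-1

# Optimize a synthetic image so its patch distribution matches the target's.
target = torch.rand(1, 3, 128, 128)  # stand-in for a training image
synth = torch.rand(1, 3, 128, 128, requires_grad=True)
opt = torch.optim.Adam([synth], lr=0.01)
target_patches = extract_patches(target)
for _ in range(300):
    opt.zero_grad()
    loss = swd(extract_patches(synth), target_patches)
    loss.backward()
    opt.step()

Because the 1-D Wasserstein distance between sorted projections is differentiable and requires only sorting, each step is cheap compared with nearest-neighbor patch search, which is what makes the direct distribution-matching approach fast.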

Cite

Text

Elnekave and Weiss. "Generating Natural Images with Direct Patch Distributions Matching." Proceedings of the European Conference on Computer Vision (ECCV), 2022. doi:10.1007/978-3-031-19790-1_33

Markdown

[Elnekave and Weiss. "Generating Natural Images with Direct Patch Distributions Matching." Proceedings of the European Conference on Computer Vision (ECCV), 2022.](https://mlanthology.org/eccv/2022/elnekave2022eccv-generating/) doi:10.1007/978-3-031-19790-1_33

BibTeX

@inproceedings{elnekave2022eccv-generating,
  title     = {{Generating Natural Images with Direct Patch Distributions Matching}},
  author    = {Elnekave, Ariel and Weiss, Yair},
  booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
  year      = {2022},
  doi       = {10.1007/978-3-031-19790-1_33},
  url       = {https://mlanthology.org/eccv/2022/elnekave2022eccv-generating/}
}