MatteFormer: Transformer-Based Image Matting via Prior-Tokens
Abstract
In this paper, we propose a transformer-based image matting model called MatteFormer, which takes full advantage of trimap information in the transformer block. Our method first introduces a prior-token, which is a global representation of each trimap region (i.e., foreground, background, and unknown). These prior-tokens are used as global priors and participate in the self-attention mechanism of each block. Each stage of the encoder is composed of PAST (Prior-Attentive Swin Transformer) blocks, which are based on the Swin Transformer block but differ in two aspects: 1) they have a PA-WSA (Prior-Attentive Window Self-Attention) layer, which performs self-attention not only with spatial-tokens but also with prior-tokens; 2) they have a prior-memory that accumulates prior-tokens from the previous blocks and passes them on to the next block. We evaluate MatteFormer on the commonly used image matting datasets Composition-1k and Distinctions-646. Experimental results show that our proposed method achieves state-of-the-art performance by a large margin. Our code is available at https://github.com/webtoon/matteformer.
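To make the prior-token mechanism described in the abstract concrete, below is a minimal, self-contained PyTorch sketch of how prior-tokens and a prior-attentive window self-attention layer could look. This is our own illustration, not the authors' code: the names PriorAttentiveWSA and make_prior_tokens, and the pooling-based construction of prior-tokens, are assumptions; the official repository linked above contains the exact implementation.

# Hedged sketch (not the authors' implementation) of the prior-token idea.
import torch
import torch.nn as nn


def make_prior_tokens(x, trimap):
    """Masked global average pooling per trimap region (fg / bg / unknown).

    x:      (B, N, C) spatial tokens, N = H * W
    trimap: (B, N, 3) one-hot region masks
    returns (B, 3, C) prior-tokens, one per region
    """
    # Normalize each region mask so pooling is an average; clamp avoids
    # division by zero when a region is empty.
    weights = trimap / trimap.sum(dim=1, keepdim=True).clamp(min=1.0)  # (B, N, 3)
    return torch.einsum('bnc,bnr->brc', x, weights)                    # (B, 3, C)


class PriorAttentiveWSA(nn.Module):
    """Window self-attention whose keys/values also include prior-tokens."""

    def __init__(self, dim, num_heads):
        super().__init__()
        self.num_heads = num_heads
        self.scale = (dim // num_heads) ** -0.5
        self.qkv = nn.Linear(dim, dim * 3)
        self.proj = nn.Linear(dim, dim)

    def forward(self, win_tokens, prior_tokens):
        # win_tokens:   (B*num_windows, Nw, C) spatial tokens inside one window
        # prior_tokens: (B*num_windows, P, C)  current + accumulated prior-tokens
        B, Nw, C = win_tokens.shape
        P = prior_tokens.shape[1]
        h, d = self.num_heads, C // self.num_heads

        # Queries come only from spatial tokens; keys/values also see the priors.
        q = self.qkv(win_tokens)[..., :C]
        kv = self.qkv(torch.cat([win_tokens, prior_tokens], dim=1))[..., C:]
        k, v = kv.chunk(2, dim=-1)

        q = q.view(B, Nw, h, d).transpose(1, 2)          # (B, h, Nw, d)
        k = k.view(B, Nw + P, h, d).transpose(1, 2)      # (B, h, Nw+P, d)
        v = v.view(B, Nw + P, h, d).transpose(1, 2)

        attn = (q @ k.transpose(-2, -1)) * self.scale    # (B, h, Nw, Nw+P)
        out = attn.softmax(dim=-1) @ v                   # (B, h, Nw, d)
        out = out.transpose(1, 2).reshape(B, Nw, C)
        return self.proj(out)


if __name__ == "__main__":
    # Quick shape check: one 7x7 window per batch item, three prior-tokens.
    layer = PriorAttentiveWSA(dim=96, num_heads=3)
    spatial = torch.randn(2, 49, 96)
    priors = torch.randn(2, 3, 96)
    print(layer(spatial, priors).shape)  # torch.Size([2, 49, 96])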
Cite
Text
Park et al. "MatteFormer: Transformer-Based Image Matting via Prior-Tokens." Conference on Computer Vision and Pattern Recognition, 2022. doi:10.1109/CVPR52688.2022.01140
Markdown
[Park et al. "MatteFormer: Transformer-Based Image Matting via Prior-Tokens." Conference on Computer Vision and Pattern Recognition, 2022.](https://mlanthology.org/cvpr/2022/park2022cvpr-matteformer/) doi:10.1109/CVPR52688.2022.01140
BibTeX
@inproceedings{park2022cvpr-matteformer,
  title     = {{MatteFormer: Transformer-Based Image Matting via Prior-Tokens}},
  author    = {Park, GyuTae and Son, SungJoon and Yoo, JaeYoung and Kim, SeHo and Kwak, Nojun},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2022},
  pages     = {11696--11706},
  doi       = {10.1109/CVPR52688.2022.01140},
  url       = {https://mlanthology.org/cvpr/2022/park2022cvpr-matteformer/}
}