Learning Representations of Sets Through Optimized Permutations
Abstract
Representations of sets are challenging to learn because operations on sets should be permutation-invariant. To this end, we propose a Permutation-Optimisation module that learns how to permute a set end-to-end. The permuted set can be further processed to learn a permutation-invariant representation of that set, avoiding a bottleneck in traditional set models. We demonstrate our model's ability to learn permutations and set representations with either explicit or implicit supervision on four datasets, on which we achieve state-of-the-art results: number sorting, image mosaics, classification from image mosaics, and visual question answering.
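The abstract describes learning a permutation of a set end-to-end so the permuted set can be processed into a permutation-invariant representation. As a minimal illustrative sketch (not the paper's exact Permutation-Optimisation module), one common way to make permutations differentiable is to relax them into doubly-stochastic "soft" permutation matrices via Sinkhorn normalization; all names below are hypothetical:

```python
import numpy as np

def sinkhorn(log_scores, n_iters=20):
    """Turn a raw score matrix into an (approximately) doubly-stochastic
    soft permutation by alternating row/column normalization in log space."""
    m = log_scores
    for _ in range(n_iters):
        m = m - np.logaddexp.reduce(m, axis=1, keepdims=True)  # rows sum to ~1
        m = m - np.logaddexp.reduce(m, axis=0, keepdims=True)  # cols sum to ~1
    return np.exp(m)

# Toy set of 4 one-dimensional elements and a score matrix
# (random here for illustration; in training it would be produced
# by a network and optimized end-to-end).
rng = np.random.default_rng(0)
elements = rng.normal(size=(4, 1))
scores = rng.normal(size=(4, 4))

P = sinkhorn(scores)      # soft permutation matrix, shape (4, 4)
permuted = P @ elements   # softly reordered set, ready for further processing
```

Because `P` is a smooth relaxation of a hard permutation, gradients flow through the reordering step, which is the property that makes this family of approaches trainable end-to-end.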
Cite
Text
Zhang et al. "Learning Representations of Sets Through Optimized Permutations." International Conference on Learning Representations, 2019.
Markdown
[Zhang et al. "Learning Representations of Sets Through Optimized Permutations." International Conference on Learning Representations, 2019.](https://mlanthology.org/iclr/2019/zhang2019iclr-learning/)
BibTeX
@inproceedings{zhang2019iclr-learning,
  title = {{Learning Representations of Sets Through Optimized Permutations}},
  author = {Zhang, Yan and Hare, Jonathon and Prügel-Bennett, Adam},
  booktitle = {International Conference on Learning Representations},
  year = {2019},
  url = {https://mlanthology.org/iclr/2019/zhang2019iclr-learning/}
}