Differentiable Sorting Networks for Scalable Sorting and Ranking Supervision

Abstract

Sorting and ranking supervision is a method for training neural networks end-to-end based on ordering constraints. That is, the ground truth order of sets of samples is known, while their absolute values remain unsupervised. For that, we propose differentiable sorting networks by relaxing their pairwise conditional swap operations. To address the problems of vanishing gradients and extensive blurring that arise with larger numbers of layers, we propose mapping activations to regions with moderate gradients. We consider odd-even as well as bitonic sorting networks, which outperform existing relaxations of the sorting operation. We show that bitonic sorting networks can achieve stable training on large input sets of up to 1024 elements.
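To illustrate the core idea of relaxing the pairwise conditional swap, here is a minimal, hypothetical sketch (not the authors' released code): each swap is softened with a logistic sigmoid and the soft swaps are stacked into an odd-even sorting network. The function names, the steepness parameter, and the use of NumPy are assumptions for illustration only.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def soft_cond_swap(a, b, steepness=10.0):
    """Relaxed conditional swap: returns a soft (min, max) of a and b.

    `steepness` controls how closely the sigmoid approximates a hard step;
    the exact relaxation used in the paper may differ (this is a sketch).
    """
    swap = sigmoid(steepness * (a - b))   # ~1 if a > b, ~0 otherwise
    soft_min = swap * b + (1.0 - swap) * a
    soft_max = swap * a + (1.0 - swap) * b
    return soft_min, soft_max

def soft_odd_even_sort(x, steepness=10.0):
    """Differentiable odd-even transposition sort over a 1-D array."""
    x = np.asarray(x, dtype=float).copy()
    n = len(x)
    for layer in range(n):                # n layers suffice for odd-even sort
        start = layer % 2                 # alternate even / odd pairings
        for i in range(start, n - 1, 2):
            x[i], x[i + 1] = soft_cond_swap(x[i], x[i + 1], steepness)
    return x

print(soft_odd_even_sort([3.0, 1.0, 2.0]))   # approximately [1, 2, 3]

With many layers, a fixed sigmoid steepness either saturates (vanishing gradients) or over-smooths (blurring), which is the issue the abstract's activation-mapping proposal addresses.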

Cite

Text

Petersen et al. "Differentiable Sorting Networks for Scalable Sorting and Ranking Supervision." International Conference on Machine Learning, 2021.

Markdown

[Petersen et al. "Differentiable Sorting Networks for Scalable Sorting and Ranking Supervision." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/petersen2021icml-differentiable/)

BibTeX

@inproceedings{petersen2021icml-differentiable,
  title     = {{Differentiable Sorting Networks for Scalable Sorting and Ranking Supervision}},
  author    = {Petersen, Felix and Borgelt, Christian and Kuehne, Hilde and Deussen, Oliver},
  booktitle = {International Conference on Machine Learning},
  year      = {2021},
  pages     = {8546--8555},
  volume    = {139},
  url       = {https://mlanthology.org/icml/2021/petersen2021icml-differentiable/}
}