A Walkthrough for the Principle of Logit Separation

Abstract

We consider neural network training in applications in which there are many possible classes, but at test time the task is binary: determining whether a given example belongs to a specific class. We define the Single Logit Classification (SLC) task: training the network so that at test time it is possible to accurately identify whether an example belongs to a given class, in a computationally efficient manner, based only on the output logit for that class. We propose a natural principle, the Principle of Logit Separation, as a guideline for choosing and designing loss functions that are suitable for SLC. We show that the Principle of Logit Separation is a crucial ingredient for success in the SLC task, and that SLC yields considerable speedups when the number of classes is large.
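To make the SLC setting concrete, below is a minimal sketch (not from the paper) of single-logit inference in PyTorch. It assumes the network splits into a shared encoder and a final linear classification layer; the function name, the split, and the threshold are all illustrative assumptions. Only the logit for the class of interest is computed, via a dot product with one row of the final weight matrix, which is where the speedup comes from when the number of classes is large.

import torch

@torch.no_grad()
def slc_predict(encoder, classifier, x, class_index, threshold=0.0):
    # Single Logit Classification at test time (illustrative sketch).
    # Compute only the logit for `class_index`: a dot product with one
    # row of the final layer's weights, instead of a matrix product
    # over all classes followed by a softmax. The threshold value is
    # an assumption; in practice it would be tuned on held-out data.
    h = encoder(x)                          # shared features, shape (batch, d)
    w = classifier.weight[class_index]      # weight row for one class, shape (d,)
    b = classifier.bias[class_index]
    logit = h @ w + b                       # shape (batch,)
    return logit > threshold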

Cite

Text

Keren et al. "A Walkthrough for the Principle of Logit Separation." International Joint Conference on Artificial Intelligence, 2019. doi:10.24963/IJCAI.2019/861

Markdown

[Keren et al. "A Walkthrough for the Principle of Logit Separation." International Joint Conference on Artificial Intelligence, 2019.](https://mlanthology.org/ijcai/2019/keren2019ijcai-walkthrough/) doi:10.24963/IJCAI.2019/861

BibTeX

@inproceedings{keren2019ijcai-walkthrough,
  title     = {{A Walkthrough for the Principle of Logit Separation}},
  author    = {Keren, Gil and Sabato, Sivan and Schuller, Björn W.},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2019},
  pages     = {6191--6195},
  doi       = {10.24963/IJCAI.2019/861},
  url       = {https://mlanthology.org/ijcai/2019/keren2019ijcai-walkthrough/}
}