Searching Parameterized AP Loss for Object Detection

Abstract

Loss functions play an important role in training deep-network-based object detectors. The most widely used evaluation metric for object detection is Average Precision (AP), which captures the performance of the localization and classification sub-tasks simultaneously. However, because the AP metric is non-differentiable, traditional object detectors adopt separate differentiable losses for the two sub-tasks. This misalignment between the training losses and the evaluation metric can degrade performance. To address this, existing works manually design surrogate losses for the AP metric, which requires expertise and may still be sub-optimal. In this paper, we propose Parameterized AP Loss, where parameterized functions are introduced to substitute the non-differentiable components in the AP calculation. Different AP approximations are thus represented by a family of parameterized functions in a unified formula. An automatic parameter search algorithm is then employed to find the optimal parameters. Extensive experiments on the COCO benchmark with three different object detectors (i.e., RetinaNet, Faster R-CNN, and Deformable DETR) demonstrate that the proposed Parameterized AP Loss consistently outperforms existing handcrafted losses. Code shall be released.
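
To make the core idea concrete, below is a minimal PyTorch sketch of a differentiable AP surrogate obtained by substituting the Heaviside step function in the pairwise ranking computation with a parameterized smooth function. The single-parameter sigmoid substitution and the function names (smooth_step, parameterized_ap_loss) are illustrative assumptions for this page, not the authors' released implementation; the paper searches over a richer parameterized family of substitution functions.

import torch

def smooth_step(x, scale):
    # Hypothetical single-parameter substitute for the Heaviside step H(x);
    # the paper instead searches the parameters of a more general family.
    return torch.sigmoid(scale * x)

def parameterized_ap_loss(scores, labels, scale=4.0):
    # scores: (N,) predicted confidences; labels: (N,) 1 = positive, 0 = negative.
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    # Soft rank of each positive among the positives: 1 + soft count of
    # higher-scored positives (self-comparisons removed from the diagonal).
    pp = smooth_step(pos.unsqueeze(0) - pos.unsqueeze(1), scale)
    pp = pp - torch.diag(torch.diagonal(pp))
    rank_pos = 1.0 + pp.sum(dim=1)
    # Soft count of negatives scored above each positive.
    pn = smooth_step(neg.unsqueeze(0) - pos.unsqueeze(1), scale).sum(dim=1)
    # Soft precision at each positive's rank; their mean approximates AP.
    precision = rank_pos / (rank_pos + pn)
    return 1.0 - precision.mean()

scores = torch.tensor([0.9, 0.2, 0.7, 0.4], requires_grad=True)
labels = torch.tensor([1, 0, 1, 0])
loss = parameterized_ap_loss(scores, labels)
loss.backward()  # gradients now flow through the AP approximation

Because smooth_step is differentiable, the scale parameter (and, in the paper, the full set of substitution-function parameters) can be tuned by an automatic search procedure rather than handcrafted.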

Cite

Text

Tao et al. "Searching Parameterized AP Loss for Object Detection." Neural Information Processing Systems, 2021.

Markdown

[Tao et al. "Searching Parameterized AP Loss for Object Detection." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/chenxin2021neurips-searching/)

BibTeX

@inproceedings{chenxin2021neurips-searching,
  title     = {{Searching Parameterized AP Loss for Object Detection}},
  author    = {Tao, Chenxin and Li, Zizhang and Zhu, Xizhou and Huang, Gao and Liu, Yong and Dai, Jifeng},
  booktitle = {Neural Information Processing Systems},
  year      = {2021},
  url       = {https://mlanthology.org/neurips/2021/chenxin2021neurips-searching/}
}