MI-DETR: An Object Detection Model with Multi-Time Inquiries Mechanism

Abstract

Based on an analysis of the cascaded decoder architecture commonly adopted in existing DETR-like models, this paper proposes a new decoder architecture. The cascaded decoder architecture constrains object queries to update only along the cascade direction, so object queries can learn relatively limited information from image features. However, the challenges of object detection in natural scenes (e.g., objects that are extremely small, heavily occluded, or confusingly mixed with the background) require an object detection model to fully exploit image features, which motivates us to propose a new decoder architecture with a parallel Multi-time Inquiries (MI) mechanism. The MI mechanism is very simple: it enables object queries to perform multiple inquiries into the image features in parallel, learning more comprehensive information from them. Our MI-based model, MI-DETR, outperforms all existing DETR-like models on the COCO benchmark across different backbones and training epochs, achieving +2.3 AP and +0.6 AP improvements over the representative DINO model and the state-of-the-art Relation-DETR model, respectively, with a ResNet-50 backbone.
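
To make the contrast with a cascaded decoder concrete, below is a minimal sketch of what a parallel multi-inquiry step might look like. This is an illustration under assumptions, not the paper's implementation: the module name `ParallelInquiries`, the number of inquiries, and the concatenate-then-project fusion are all hypothetical.

```python
import torch
import torch.nn as nn

class ParallelInquiries(nn.Module):
    """Illustrative sketch (not the paper's code): object queries issue
    several cross-attention "inquiries" into the image features in
    parallel, and the results are fused, rather than being refined only
    layer-by-layer along a cascade."""

    def __init__(self, d_model: int = 256, n_heads: int = 8, n_inquiries: int = 3):
        super().__init__()
        self.inquiries = nn.ModuleList([
            nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            for _ in range(n_inquiries)
        ])
        # Assumed fusion: concatenate parallel outputs, project back to d_model.
        self.fuse = nn.Linear(n_inquiries * d_model, d_model)

    def forward(self, queries: torch.Tensor, image_feats: torch.Tensor) -> torch.Tensor:
        # queries:     (batch, num_queries, d_model) object queries
        # image_feats: (batch, H*W, d_model) flattened encoder features
        outs = [attn(queries, image_feats, image_feats)[0]
                for attn in self.inquiries]
        return self.fuse(torch.cat(outs, dim=-1))
```

In this sketch every inquiry attends over the same image features from the same queries, so information gathering happens side by side instead of strictly one layer after another, which is the distinction the abstract draws against the cascaded update.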

Cite

Text

Nan et al. "MI-DETR: An Object Detection Model with Multi-Time Inquiries Mechanism." Conference on Computer Vision and Pattern Recognition, 2025. doi:10.1109/CVPR52734.2025.00443

Markdown

[Nan et al. "MI-DETR: An Object Detection Model with Multi-Time Inquiries Mechanism." Conference on Computer Vision and Pattern Recognition, 2025.](https://mlanthology.org/cvpr/2025/nan2025cvpr-midetr/) doi:10.1109/CVPR52734.2025.00443

BibTeX

@inproceedings{nan2025cvpr-midetr,
  title     = {{MI-DETR: An Object Detection Model with Multi-Time Inquiries Mechanism}},
  author    = {Nan, Zhixiong and Li, Xianghong and Dai, Jifeng and Xiang, Tao},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2025},
  pages     = {4703--4712},
  doi       = {10.1109/CVPR52734.2025.00443},
  url       = {https://mlanthology.org/cvpr/2025/nan2025cvpr-midetr/}
}