UniIR: Training and Benchmarking Universal Multimodal Information Retrievers

Abstract

Existing information retrieval (IR) models often assume a homogeneous format, limiting their applicability to diverse user needs, such as searching for images with text descriptions, searching for a news article with a headline image, or finding a similar photo with a query image. To address such diverse information-seeking demands, we introduce UniIR, a unified instruction-guided multimodal retriever capable of handling eight distinct retrieval tasks across modalities. UniIR, a single retrieval system jointly trained on ten diverse multimodal IR datasets, interprets user instructions to execute various retrieval tasks, demonstrating robust performance on existing datasets and zero-shot generalization to new tasks. Our experiments highlight that multi-task training and instruction tuning are key to UniIR's generalization ability. Additionally, we construct M-BEIR, a multimodal retrieval benchmark with comprehensive results, to standardize the evaluation of universal multimodal information retrieval.
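To make the instruction-guided retrieval idea concrete, the following is a minimal, illustrative sketch: a task instruction is fused with the query, and candidates are ranked by cosine similarity, dual-encoder style. The toy bag-of-words `embed`, the concatenation-based fusion, and the function names are hypothetical stand-ins for exposition only; they are not UniIR's actual encoders or fusion mechanism.

```python
import hashlib
import math

def embed(text: str, dim: int = 64) -> list[float]:
    """Toy deterministic encoder: hash tokens into a unit-normalized
    bag-of-words vector (a stand-in for a real multimodal encoder)."""
    vec = [0.0] * dim
    for tok in text.lower().split():
        idx = int(hashlib.md5(tok.encode()).hexdigest(), 16) % dim
        vec[idx] += 1.0
    norm = math.sqrt(sum(v * v for v in vec))
    return [v / norm for v in vec] if norm else vec

def score(instruction: str, query: str, candidate: str) -> float:
    """Fuse the task instruction with the query, then score a candidate
    by cosine similarity (embeddings are already unit-normalized)."""
    fused = embed(instruction + " " + query)
    cand = embed(candidate)
    return sum(a * b for a, b in zip(fused, cand))

def retrieve(instruction: str, query: str, candidates: list[str]) -> str:
    """Return the highest-scoring candidate for this instruction + query."""
    return max(candidates, key=lambda c: score(instruction, query, c))
```

The instruction acts as a task switch: the same query text can be steered toward different candidate pools (e.g., captions vs. news articles) simply by changing the instruction, which is the core idea the abstract describes.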

Cite

Text

Wei et al. "UniIR: Training and Benchmarking Universal Multimodal Information Retrievers." Proceedings of the European Conference on Computer Vision (ECCV), 2024. doi:10.1007/978-3-031-73021-4_23

Markdown

[Wei et al. "UniIR: Training and Benchmarking Universal Multimodal Information Retrievers." Proceedings of the European Conference on Computer Vision (ECCV), 2024.](https://mlanthology.org/eccv/2024/wei2024eccv-uniir/) doi:10.1007/978-3-031-73021-4_23

BibTeX

@inproceedings{wei2024eccv-uniir,
  title     = {{UniIR: Training and Benchmarking Universal Multimodal Information Retrievers}},
  author    = {Wei, Cong and Chen, Yang and Chen, Haonan and Hu, Hexiang and Zhang, Ge and Fu, Jie and Ritter, Alan and Chen, Wenhu},
  booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
  year      = {2024},
  doi       = {10.1007/978-3-031-73021-4_23},
  url       = {https://mlanthology.org/eccv/2024/wei2024eccv-uniir/}
}