Learning Together Securely: Prototype-Based Federated Multi-Modal Hashing for Safe and Efficient Multi-Modal Retrieval

Abstract

With the proliferation of multi-modal data, safe and efficient multi-modal hashing retrieval has become a pressing research challenge, particularly due to concerns over data privacy during centralized processing. To address this, we propose Prototype-based Federated Multi-modal Hashing (PFMH), an innovative framework that seamlessly integrates federated learning with multi-modal hashing techniques. PFMH achieves fine-grained fusion of heterogeneous multi-modal data, enhancing retrieval accuracy, and preserves data privacy by communicating class prototypes instead of raw data, which also reduces communication costs and mitigates the risk of data leakage. Furthermore, a prototype completion strategy enables PFMH to tackle class imbalance and statistical heterogeneity in multi-modal data, improving model generalization and performance across diverse data distributions. Extensive experiments demonstrate the efficiency and effectiveness of PFMH within the federated learning framework, enabling distributed training for secure and precise multi-modal retrieval in real-world scenarios.
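To make the prototype-based communication and prototype completion ideas concrete, here is a minimal illustrative sketch (not the paper's actual PFMH algorithm; all function names and the weighted-averaging scheme are assumptions). Each client computes per-class mean embeddings (prototypes), the server aggregates prototypes across clients, and clients fill in locally missing classes with the global prototypes:

```python
import numpy as np

def local_prototypes(embeddings, labels, num_classes):
    """Client side: per-class mean embeddings (prototypes).
    Classes absent from this client's data map to None,
    reflecting class imbalance across clients."""
    protos = {}
    for c in range(num_classes):
        mask = labels == c
        protos[c] = embeddings[mask].mean(axis=0) if mask.any() else None
    return protos

def aggregate_prototypes(client_protos, client_counts, num_classes):
    """Server side: average each class prototype over the clients
    that hold it, weighted by per-class sample counts (an assumed,
    simple aggregation rule)."""
    global_protos = {}
    for c in range(num_classes):
        vecs, weights = [], []
        for protos, counts in zip(client_protos, client_counts):
            if protos[c] is not None:
                vecs.append(protos[c])
                weights.append(counts[c])
        if vecs:
            w = np.asarray(weights, dtype=float)
            global_protos[c] = np.average(np.stack(vecs), axis=0, weights=w)
    return global_protos

def complete_prototypes(local, global_protos):
    """Prototype completion: substitute the global prototype for any
    class missing from a client's local data."""
    return {c: (p if p is not None else global_protos.get(c))
            for c, p in local.items()}
```

Only the low-dimensional prototypes cross the network, which is what makes this style of communication cheaper and less leaky than exchanging raw multi-modal features or full model gradients.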

Cite

Text

Zuo et al. "Learning Together Securely: Prototype-Based Federated Multi-Modal Hashing for Safe and Efficient Multi-Modal Retrieval." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I21.34475

Markdown

[Zuo et al. "Learning Together Securely: Prototype-Based Federated Multi-Modal Hashing for Safe and Efficient Multi-Modal Retrieval." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/zuo2025aaai-learning/) doi:10.1609/AAAI.V39I21.34475

BibTeX

@inproceedings{zuo2025aaai-learning,
  title     = {{Learning Together Securely: Prototype-Based Federated Multi-Modal Hashing for Safe and Efficient Multi-Modal Retrieval}},
  author    = {Zuo, Ruifan and Zheng, Chaoqun and Zhu, Lei and Lu, Wenpeng and Xiang, Yuanyuan and Li, Zhao and Qu, Xiaofeng},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {23108--23116},
  doi       = {10.1609/AAAI.V39I21.34475},
  url       = {https://mlanthology.org/aaai/2025/zuo2025aaai-learning/}
}