MONITRS: Multimodal Observations of Natural Incidents Through Remote Sensing

Abstract

Natural disasters cause devastating damage to communities and infrastructure every year. Effective disaster response is hampered by the difficulty of accessing affected areas during and after events. Remote sensing makes it possible to monitor these areas without physical access, and recent advances in computer vision and deep learning help automate satellite imagery analysis. However, existing approaches remain limited by their narrow focus on specific disaster types, reliance on manual expert interpretation, and lack of datasets with sufficient temporal granularity or natural language annotations for tracking disaster progression. We present MONITRS, a novel multimodal dataset of $\sim$10,000 FEMA disaster events that pairs temporal satellite imagery with natural language annotations drawn from news articles, geotagged locations, and question-answer pairs. We demonstrate that fine-tuning existing MLLMs on our dataset yields significant performance improvements on disaster monitoring tasks, establishing a new benchmark for machine learning-assisted disaster response systems.
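
To make the dataset's multimodal structure concrete, here is a minimal sketch of what a single MONITRS-style event record could look like. All field names and types below are illustrative assumptions based on the abstract's description (temporal imagery, news-derived annotations, geotags, QA pairs), not the released schema.

```python
# Hypothetical sketch of one MONITRS-style event record.
# Field names and types are assumptions for exposition only,
# not the dataset's actual released format.
from dataclasses import dataclass, field


@dataclass
class SatelliteFrame:
    """One timestamped satellite image in an event's time series (hypothetical)."""
    acquired_on: str   # ISO-8601 date, e.g. "2023-08-29"
    image_path: str    # path or URL to the imagery tile
    caption: str       # natural-language annotation derived from news articles


@dataclass
class DisasterEvent:
    """A single FEMA disaster event with multimodal annotations (hypothetical)."""
    event_id: str       # e.g. a FEMA declaration identifier
    disaster_type: str  # e.g. "hurricane", "wildfire", "flood"
    latitude: float     # geotagged event location
    longitude: float
    frames: list[SatelliteFrame] = field(default_factory=list)    # temporal imagery
    qa_pairs: list[tuple[str, str]] = field(default_factory=list)  # (question, answer)


# Example usage with made-up values:
event = DisasterEvent(
    event_id="DR-0000",
    disaster_type="hurricane",
    latitude=29.95,
    longitude=-90.07,
    frames=[SatelliteFrame("2023-08-29", "tiles/dr0000_t0.tif",
                           "Flooding visible along the coastline.")],
    qa_pairs=[("What damage is visible?", "Widespread coastal flooding.")],
)
print(event.disaster_type, len(event.frames), "frame(s)")
```

A record shaped like this would support both the temporal progression tracking and the question-answering tasks the abstract describes, since each image carries its own timestamp and caption while QA pairs attach to the event as a whole.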

Cite

Text

Revankar et al. "MONITRS: Multimodal Observations of Natural Incidents Through Remote Sensing." Advances in Neural Information Processing Systems, 2025.

Markdown

[Revankar et al. "MONITRS: Multimodal Observations of Natural Incidents Through Remote Sensing." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/revankar2025neurips-monitrs/)

BibTeX

@inproceedings{revankar2025neurips-monitrs,
  title     = {{MONITRS: Multimodal Observations of Natural Incidents Through Remote Sensing}},
  author    = {Revankar, Shreelekha and Mall, Utkarsh and Phoo, Cheng Perng and Bala, Kavita and Hariharan, Bharath},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/revankar2025neurips-monitrs/}
}