Interactive Mars Image Content-Based Search with Interpretable Machine Learning

Abstract

The NASA Planetary Data System (PDS) hosts millions of images of planets, moons, and other bodies collected across many missions. The ever-growing volume of data and user engagement demands an interpretable content classification system to support scientific discovery and individual curiosity. In this paper, we leverage a prototype-based architecture that enables users to understand and validate the evidence used by a classifier trained on images from the Mars Science Laboratory (MSL) Curiosity rover mission. In addition to providing explanations, we investigate the diversity and correctness of evidence used by the content-based classifier. The work presented in this paper will be deployed on the PDS Image Atlas, replacing its non-interpretable counterpart.

Cite

Text

Vasu et al. "Interactive Mars Image Content-Based Search with Interpretable Machine Learning." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I21.30338

Markdown

[Vasu et al. "Interactive Mars Image Content-Based Search with Interpretable Machine Learning." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/vasu2024aaai-interactive/) doi:10.1609/AAAI.V38I21.30338

BibTeX

@inproceedings{vasu2024aaai-interactive,
  title     = {{Interactive Mars Image Content-Based Search with Interpretable Machine Learning}},
  author    = {Vasu, Bhavan and Lu, Steven and Dunkel, Emily and Wagstaff, Kiri L. and Grimes, Kevin and McAuley, Michael},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {22976--22982},
  doi       = {10.1609/aaai.v38i21.30338},
  url       = {https://mlanthology.org/aaai/2024/vasu2024aaai-interactive/}
}