Exploring Generalisability of Self-Distillation with No Labels for SAR-Based Vegetation Prediction

Abstract

In this work we pre-train a DINO-ViT-based model using two Synthetic Aperture Radar datasets (S1GRD or GSSIC) across three regions (China, CONUS, Europe). We fine-tune the models on smaller labeled datasets to predict vegetation percentage, and empirically study the connection between the embedding space of the models and their ability to generalize across diverse geographic regions and to unseen data. For S1GRD, the embedding spaces of different regions are clearly separated, while those of GSSIC overlap. Positional patterns remain during fine-tuning, and greater embedding distances often correspond to higher errors on unfamiliar regions. Overall, our work improves the understanding of the generalizability of self-supervised models applied to remote sensing.
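
The analysis described above relates the separation between regional embedding spaces to transfer error. The sketch below illustrates one minimal way such a comparison could be set up; it is not the authors' code, and the embeddings, region names, and RMSE values are placeholders used purely for illustration.

# Hypothetical sketch: relate inter-region embedding distances to
# cross-region fine-tuning error. All array contents are placeholders,
# not the authors' data or implementation.
import numpy as np

# Placeholder pre-trained encoder embeddings per region (n_samples x dim).
rng = np.random.default_rng(0)
regions = ["China", "CONUS", "Europe"]
embeddings = {r: rng.normal(size=(256, 768)) for r in regions}

# Mean embedding per region and pairwise Euclidean distances between centroids.
centroids = {r: e.mean(axis=0) for r, e in embeddings.items()}
pairs = [(a, b) for i, a in enumerate(regions) for b in regions[i + 1:]]
distances = np.array([np.linalg.norm(centroids[a] - centroids[b]) for a, b in pairs])

# Placeholder cross-region vegetation RMSE, one value per region pair
# (fine-tune on the first region, evaluate on the second).
rmse = np.array([0.21, 0.34, 0.28])

# Correlation between embedding separation and transfer error.
r = np.corrcoef(distances, rmse)[0, 1]
print(f"Pearson r between centroid distance and RMSE: {r:.2f}")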

Cite

Text

Martínez-Ferrer et al. "Exploring Generalisability of Self-Distillation with No Labels for SAR-Based Vegetation Prediction." NeurIPS 2023 Workshops: DistShift, 2023.

Markdown

[Martínez-Ferrer et al. "Exploring Generalisability of Self-Distillation with No Labels for SAR-Based Vegetation Prediction." NeurIPS 2023 Workshops: DistShift, 2023.](https://mlanthology.org/neuripsw/2023/martinezferrer2023neuripsw-exploring/)

BibTeX

@inproceedings{martinezferrer2023neuripsw-exploring,
  title     = {{Exploring Generalisability of Self-Distillation with No Labels for SAR-Based Vegetation Prediction}},
  author    = {Martínez-Ferrer, Laura and Jungbluth, Anna and Mejia, Joseph Alejandro Gallego and Allen, Matt and Dorr, Francisco and Kalaitzis, Freddie and Ramos-Pollán, Raúl},
  booktitle = {NeurIPS 2023 Workshops: DistShift},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/martinezferrer2023neuripsw-exploring/}
}