Channel Selection for Test-Time Adaptation Under Distribution Shift

Abstract

To ensure robustness and generalization in real-world scenarios, test-time adaptation has recently been studied as an approach to adjust models to a new data distribution during inference. Test-time batch normalization is a simple and popular method that has achieved compelling performance on domain shift benchmarks by recalculating batch normalization statistics on test batches. However, in many practical applications this technique is vulnerable to label distribution shifts. We propose to tackle this challenge by selectively adapting only a subset of channels in a deep network, minimizing the drastic adaptation that makes the method sensitive to label shifts. We find that the adapted models significantly improve performance compared to the baseline models and counteract unknown label shifts.
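
The abstract describes two mechanisms: recomputing batch normalization statistics on the test batch, and restricting that adaptation to a subset of channels. Below is a minimal PyTorch sketch of this idea, assuming a per-channel boolean mask is supplied externally. The class name and the mask are illustrative assumptions; the paper's actual channel-selection criterion is not given in the abstract, so this is not the authors' implementation.

import torch
import torch.nn as nn

class SelectiveTestTimeBN(nn.Module):
    """Wraps a trained nn.BatchNorm2d and re-estimates statistics on the
    current test batch for selected channels only; unselected channels keep
    the source (training) running statistics."""

    def __init__(self, source_bn: nn.BatchNorm2d, adapt_mask: torch.Tensor):
        super().__init__()
        self.bn = source_bn  # frozen source statistics and affine parameters
        self.register_buffer("adapt_mask", adapt_mask.bool())  # shape [num_channels]

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Per-channel statistics of the current test batch.
        batch_mean = x.mean(dim=(0, 2, 3))
        batch_var = x.var(dim=(0, 2, 3), unbiased=False)

        # Adapted channels use test-batch statistics; the rest keep source statistics.
        mean = torch.where(self.adapt_mask, batch_mean, self.bn.running_mean)
        var = torch.where(self.adapt_mask, batch_var, self.bn.running_var)

        x_hat = (x - mean[None, :, None, None]) / torch.sqrt(var[None, :, None, None] + self.bn.eps)
        return self.bn.weight[None, :, None, None] * x_hat + self.bn.bias[None, :, None, None]

How the mask is chosen is the crux of the method; here it is simply an input, since the abstract does not specify the selection rule.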

Cite

Text

Vianna et al. "Channel Selection for Test-Time Adaptation Under Distribution Shift." NeurIPS 2023 Workshops: DistShift, 2023.

Markdown

[Vianna et al. "Channel Selection for Test-Time Adaptation Under Distribution Shift." NeurIPS 2023 Workshops: DistShift, 2023.](https://mlanthology.org/neuripsw/2023/vianna2023neuripsw-channel/)

BibTeX

@inproceedings{vianna2023neuripsw-channel,
  title     = {{Channel Selection for Test-Time Adaptation Under Distribution Shift}},
  author    = {Vianna, Pedro and Chaudhary, Muawiz Sajjad and Tang, An and Cloutier, Guy and Wolf, Guy and Eickenberg, Michael and Belilovsky, Eugene},
  booktitle = {NeurIPS 2023 Workshops: DistShift},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/vianna2023neuripsw-channel/}
}