AdaptGrad: Adaptive Sampling to Reduce Noise

Abstract

Gradient smoothing is an efficient approach to reducing noise in gradient-based model explanation methods. SmoothGrad, which averages gradients over Gaussian-perturbed copies of the input, mitigates much of this noise. However, the crucial hyperparameter of this method, the standard deviation $\sigma$ of the Gaussian noise, is often set manually or with a heuristic, so the smoothed gradients contain extra noise introduced by the smoothing process itself. In this paper, we analyze this noise and its connection to out-of-range sampling in the smoothing process of SmoothGrad. Based on this insight, we propose AdaptGrad, an adaptive gradient smoothing method that controls out-of-range sampling to minimize the introduced noise. Comprehensive experiments, both qualitative and quantitative, demonstrate that AdaptGrad effectively removes almost all of the noise in vanilla gradients compared with baseline methods. AdaptGrad is simple and universal, making it a practical way to enhance gradient-based interpretability methods and obtain clearer visualizations.
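
To make the setup concrete, the sketch below shows the SmoothGrad baseline the abstract builds on: it averages input gradients over Gaussian-perturbed copies of the input and also reports how often perturbed coordinates fall outside the valid input range, the out-of-range sampling the paper connects to residual noise. This is a minimal PyTorch illustration under our own assumptions (a single input with a batch dimension, inputs normalized to [0, 1]); it is not the authors' code, and the adaptive control of $\sigma$ that defines AdaptGrad is not implemented here.

```python
import torch

def smoothgrad(model, x, sigma=0.15, n_samples=50, valid_range=(0.0, 1.0)):
    """Average input gradients over n_samples Gaussian-perturbed copies of x.

    Also returns the fraction of perturbed coordinates falling outside
    valid_range -- the "out-of-range sampling" the abstract links to
    residual noise in the smoothed gradient. (Illustrative sketch only.)
    """
    lo, hi = valid_range
    grads, oob_frac = torch.zeros_like(x), 0.0
    for _ in range(n_samples):
        # Perturb the input with Gaussian noise of standard deviation sigma.
        noisy = (x + sigma * torch.randn_like(x)).detach().requires_grad_(True)
        # Track how many coordinates left the valid data range.
        oob_frac += ((noisy < lo) | (noisy > hi)).float().mean().item()
        # Gradient of the top-class logit with respect to the input.
        model(noisy).max().backward()
        grads += noisy.grad
    return grads / n_samples, oob_frac / n_samples
```

With a fixed $\sigma$, the out-of-range fraction grows for inputs near the boundary of the data range; the abstract's proposal is to adapt the sampling so that this fraction, and the noise it introduces into the smoothed gradient, stays small.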

Cite

Text

Zhou et al. "AdaptGrad: Adaptive Sampling to Reduce Noise." Advances in Neural Information Processing Systems, 2025.

Markdown

[Zhou et al. "AdaptGrad: Adaptive Sampling to Reduce Noise." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/zhou2025neurips-adaptgrad/)

BibTeX

@inproceedings{zhou2025neurips-adaptgrad,
  title     = {{AdaptGrad: Adaptive Sampling to Reduce Noise}},
  author    = {Zhou, Linjiang and Ma, Chao and Wang, Zepeng and Wu, Libing and Shi, Xiaochuan},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/zhou2025neurips-adaptgrad/}
}