Boosting Adversarial Training via Fisher-Rao Norm-Based Regularization

Abstract

Adversarial training is extensively utilized to improve the adversarial robustness of deep neural networks. Yet mitigating the degradation of standard generalization performance in adversarially trained models remains an open problem. This paper attempts to resolve this issue through the lens of model complexity. First, we leverage the Fisher-Rao norm, a geometrically invariant metric for model complexity, to establish non-trivial bounds on the Cross-Entropy-Loss-based Rademacher complexity for a ReLU-activated Multi-Layer Perceptron. Building upon this observation, we propose a novel regularization framework called Logit-Oriented Adversarial Training (LOAT), which can mitigate the trade-off between robustness and accuracy while imposing only a negligible increase in computational overhead. Our extensive experiments demonstrate that the proposed regularization strategy can boost the performance of prevalent adversarial training algorithms, including PGD-AT, TRADES, TRADES (LSE), MART, and DM-AT, across various network architectures. Our code will be available at https://github.com/TrustAI/LOAT.

Cite

Text

Yin and Ruan. "Boosting Adversarial Training via Fisher-Rao Norm-Based Regularization." Conference on Computer Vision and Pattern Recognition, 2024. doi:10.1109/CVPR52733.2024.02317

Markdown

[Yin and Ruan. "Boosting Adversarial Training via Fisher-Rao Norm-Based Regularization." Conference on Computer Vision and Pattern Recognition, 2024.](https://mlanthology.org/cvpr/2024/yin2024cvpr-boosting/) doi:10.1109/CVPR52733.2024.02317

BibTeX

@inproceedings{yin2024cvpr-boosting,
  title     = {{Boosting Adversarial Training via Fisher-Rao Norm-Based Regularization}},
  author    = {Yin, Xiangyu and Ruan, Wenjie},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2024},
  pages     = {24544--24553},
  doi       = {10.1109/CVPR52733.2024.02317},
  url       = {https://mlanthology.org/cvpr/2024/yin2024cvpr-boosting/}
}