Fast and Adversarial Robust Kernelized SDU Learning

Abstract

SDU learning, a weakly supervised learning problem in which only pairwise similarity pairs, dissimilarity pairs, and unlabeled data are available, has many practical applications. However, it still lacks a defense against adversarial examples, and its learning process can be expensive. To address this gap, we propose a novel adversarial training framework for SDU learning. Departing from traditional adversarial training methods, our approach reformulates the conventional minimax problem as an equivalent minimization problem from a kernel perspective. Additionally, we employ the stochastic gradient method and random features to accelerate the training process. Theoretical analysis shows that our method converges to a stationary point at a rate of $\mathcal{O}(1/T^{1/4})$. Experimental results show that our algorithm outperforms other adversarial training methods in generalization, efficiency, and scalability under various adversarial attacks.
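The random-features acceleration mentioned in the abstract typically refers to random Fourier feature approximations of a shift-invariant kernel (Rahimi and Recht style). The sketch below is illustrative only and is not the authors' implementation: it shows how an RBF kernel can be approximated by an explicit feature map, so that kernelized training reduces to cheap linear operations. The function name and all parameter choices are assumptions for the example.

```python
import numpy as np

def random_fourier_features(X, D=2000, gamma=0.5, seed=0):
    """Approximate the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)
    via D random Fourier features (illustrative sketch, not the paper's code)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Frequencies sampled from the Gaussian spectral density of the RBF kernel.
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    # Feature map z(x) with E[z(x)^T z(y)] = k(x, y).
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# A linear model trained on Z then approximates the kernel machine.
X = np.random.default_rng(1).normal(size=(50, 5))
Z = random_fourier_features(X, D=2000, gamma=0.5, seed=0)
K_approx = Z @ Z.T
K_exact = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1))
err = np.abs(K_approx - K_exact).max()
```

With `D` features, the approximation error decays at roughly $O(1/\sqrt{D})$, which is what makes training scale to large datasets.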

Cite

Text

Fan et al. "Fast and Adversarial Robust Kernelized SDU Learning." Artificial Intelligence and Statistics, 2024.

Markdown

[Fan et al. "Fast and Adversarial Robust Kernelized SDU Learning." Artificial Intelligence and Statistics, 2024.](https://mlanthology.org/aistats/2024/fan2024aistats-fast/)

BibTeX

@inproceedings{fan2024aistats-fast,
  title     = {{Fast and Adversarial Robust Kernelized SDU Learning}},
  author    = {Fan, Yajing and Shi, Wanli and Chang, Yi and Gu, Bin},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2024},
  pages     = {1153-1161},
  volume    = {238},
  url       = {https://mlanthology.org/aistats/2024/fan2024aistats-fast/}
}