Generalization Bounds with Logarithmic Negative-Sample Dependence for Adversarial Contrastive Learning
Abstract
Contrastive learning has emerged as a powerful unsupervised learning technique for extracting meaningful representations from unlabeled data by pulling similar data points closer in the representation space and pushing dissimilar ones apart. However, its vulnerability to adversarial attacks remains a critical challenge. To address this, adversarial contrastive learning — incorporating adversarial training into the contrastive loss — offers a promising path toward robust representations that can withstand various adversarial attacks. While empirical evidence highlights its effectiveness, a comprehensive theoretical framework has been lacking. In this paper, we fill this gap by introducing generalization bounds for adversarial contrastive learning, offering key theoretical insights. Leveraging the Lipschitz continuity of loss functions, we derive generalization bounds that scale logarithmically with the number of negative samples, $K$, and apply to both linear and non-linear representations, including those obtained from deep neural networks (DNNs). Our theoretical results are supported by experiments on real-world datasets.
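As a rough illustration of the objective the abstract describes, the following sketch computes an InfoNCE-style contrastive loss with $K$ negative samples and indicates where an adversarial perturbation would enter. The function names, the use of cosine similarity, and the temperature value are our own illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def contrastive_loss(anchor, positive, negatives, tau=0.5):
    """InfoNCE-style contrastive loss (illustrative sketch).

    anchor, positive: (d,) representation vectors of two views of a sample.
    negatives: (K, d) representations of K negative samples.
    Pulls the positive pair together and pushes the K negatives apart.
    """
    def cos(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

    pos = np.exp(cos(anchor, positive) / tau)
    neg = sum(np.exp(cos(anchor, n) / tau) for n in negatives)
    return -np.log(pos / (pos + neg))

# In the adversarial variant, the anchor would first be perturbed
# within an epsilon-ball to maximize this loss (inner maximization),
# and the encoder would then be trained to minimize the worst-case
# loss (outer minimization). Only the clean loss is computed here.
```

Note that adding negatives can only increase the denominator, so the loss grows with $K$; the paper's contribution is showing the resulting generalization bound grows only logarithmically in $K$.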
Cite
Text
Ghanooni et al. "Generalization Bounds with Logarithmic Negative-Sample Dependence for Adversarial Contrastive Learning." Transactions on Machine Learning Research, 2024.
Markdown
[Ghanooni et al. "Generalization Bounds with Logarithmic Negative-Sample Dependence for Adversarial Contrastive Learning." Transactions on Machine Learning Research, 2024.](https://mlanthology.org/tmlr/2024/ghanooni2024tmlr-generalization/)
BibTeX
@article{ghanooni2024tmlr-generalization,
title = {{Generalization Bounds with Logarithmic Negative-Sample Dependence for Adversarial Contrastive Learning}},
author = {Ghanooni, Naghmeh and Mustafa, Waleed and Lei, Yunwen and Lin, Anthony Widjaja and Kloft, Marius},
journal = {Transactions on Machine Learning Research},
year = {2024},
url = {https://mlanthology.org/tmlr/2024/ghanooni2024tmlr-generalization/}
}