Improving the Robustness of Deep Neural Networks via Adversarial Training with Triplet Loss

Abstract

Recent studies have highlighted that deep neural networks (DNNs) are vulnerable to adversarial examples. In this paper, we improve the robustness of DNNs by utilizing techniques from Distance Metric Learning. Specifically, we incorporate Triplet Loss, one of the most popular Distance Metric Learning methods, into the framework of adversarial training. Our proposed algorithm, Adversarial Training with Triplet Loss (AT2L), uses the adversarial example crafted against the current model as the anchor of the triplet loss, which effectively smooths the classification boundary. Furthermore, we propose an ensemble version of AT2L, which aggregates different attack methods and model structures for better defense effects. Our empirical studies verify that the proposed approach can significantly improve the robustness of DNNs without sacrificing accuracy. Finally, we demonstrate that our specially designed triplet loss can also be used as a regularization term to enhance other defense methods.
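The sketch below illustrates the core idea described in the abstract: a training step that combines an adversarial cross-entropy term with a triplet loss whose anchor is the adversarial example, a same-class clean example as the positive, and a different-class example as the negative. This is a minimal illustration only, assuming a PyTorch setup with FGSM as the attack and the model's logits as the embedding space; names such as at2l_step, lam, and margin are illustrative assumptions, not the authors' exact formulation.

import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, eps=8 / 255):
    # Craft adversarial examples against the current model with FGSM
    # (one attack choice; the paper's ensemble version aggregates several).
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    grad = torch.autograd.grad(loss, x_adv)[0]
    return (x_adv + eps * grad.sign()).clamp(0, 1).detach()

def at2l_step(model, optimizer, x, y, x_pos, x_neg, margin=1.0, lam=0.5):
    # One adversarial-training step with a triplet regularization term.
    x_adv = fgsm_attack(model, x, y)          # anchor: adversarial example
    logits_adv = model(x_adv)
    ce = F.cross_entropy(logits_adv, y)       # standard adversarial-training loss

    # Triplet term: logits used as embeddings here for simplicity
    # (a penultimate-layer feature map could be used instead).
    emb_anchor = logits_adv
    emb_pos = model(x_pos)                    # clean example, same class as x
    emb_neg = model(x_neg)                    # clean example, different class
    triplet = F.triplet_margin_loss(emb_anchor, emb_pos, emb_neg, margin=margin)

    loss = ce + lam * triplet                 # lam balances the two terms (assumed)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()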

Cite

Text

Li et al. "Improving the Robustness of Deep Neural Networks via Adversarial Training with Triplet Loss." International Joint Conference on Artificial Intelligence, 2019. doi:10.24963/IJCAI.2019/403

Markdown

[Li et al. "Improving the Robustness of Deep Neural Networks via Adversarial Training with Triplet Loss." International Joint Conference on Artificial Intelligence, 2019.](https://mlanthology.org/ijcai/2019/li2019ijcai-improving/) doi:10.24963/IJCAI.2019/403

BibTeX

@inproceedings{li2019ijcai-improving,
  title     = {{Improving the Robustness of Deep Neural Networks via Adversarial Training with Triplet Loss}},
  author    = {Li, Pengcheng and Yi, Jinfeng and Zhou, Bowen and Zhang, Lijun},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2019},
  pages     = {2909--2915},
  doi       = {10.24963/IJCAI.2019/403},
  url       = {https://mlanthology.org/ijcai/2019/li2019ijcai-improving/}
}