Towards Robust ResNet: A Small Step but a Giant Leap
Abstract
This paper presents a simple yet principled approach to boosting the robustness of the residual network (ResNet) that is motivated by a dynamical systems perspective. Namely, a deep neural network can be interpreted as the discretization of an ordinary differential equation, which naturally inspires us to characterize ResNet via the explicit Euler method. This consequently allows us to exploit the step factor h in the Euler method to control the robustness of ResNet in both its training and generalization. In particular, we prove that a small step factor h benefits training robustness during backpropagation and generalization robustness during forward propagation. Empirical evaluation on real-world datasets corroborates our analytical findings that a small h can indeed improve both training and generalization robustness.
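The Euler-style residual update behind this view can be sketched as follows. This is a minimal illustration, not the paper's implementation: the transformation F, its weights, and the dimensions are hypothetical stand-ins for a learned residual branch.

```python
import numpy as np

def residual_block(x, F, h=0.1):
    """Explicit Euler update: x_{k+1} = x_k + h * F(x_k).

    A small step factor h damps the residual branch, which the paper
    relates to improved training and generalization robustness.
    """
    return x + h * F(x)

# Toy stand-in for a learned residual branch (hypothetical weights).
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4)) * 0.1
F = lambda x: np.tanh(x @ W)

# Forward pass through a stack of 10 residual blocks with a small h.
x = rng.normal(size=(4,))
for _ in range(10):
    x = residual_block(x, F, h=0.05)
```

With h = 1 this reduces to the standard ResNet block x + F(x); shrinking h corresponds to taking smaller steps of the underlying ODE solver.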
Cite
Text
Zhang et al. "Towards Robust ResNet: A Small Step but a Giant Leap." International Joint Conference on Artificial Intelligence, 2019. doi:10.24963/IJCAI.2019/595

Markdown
[Zhang et al. "Towards Robust ResNet: A Small Step but a Giant Leap." International Joint Conference on Artificial Intelligence, 2019.](https://mlanthology.org/ijcai/2019/zhang2019ijcai-robust/) doi:10.24963/IJCAI.2019/595

BibTeX
@inproceedings{zhang2019ijcai-robust,
title = {{Towards Robust ResNet: A Small Step but a Giant Leap}},
author = {Zhang, Jingfeng and Han, Bo and Wynter, Laura and Low, Bryan Kian Hsiang and Kankanhalli, Mohan S.},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2019},
pages = {4285-4291},
doi = {10.24963/IJCAI.2019/595},
url = {https://mlanthology.org/ijcai/2019/zhang2019ijcai-robust/}
}