Input and Weight Space Smoothing for Semi-Supervised Learning
Abstract
We propose regularizing the empirical loss for semi-supervised learning by smoothing it in both the input (data) space and the weight (parameter) space. Our method combines known input-space smoothing with a novel weight-space smoothing based on a min-max (adversarial) optimization. The resulting Adversarial Block Coordinate Descent (ABCD) algorithm performs gradient ascent with a small learning rate on a random subset of the weights, and standard gradient descent on the remaining weights in the same mini-batch. It is simple to implement and achieves state-of-the-art performance.
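The ABCD update described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the function name `abcd_step` and the hyperparameters `ascent_frac` and `ascent_lr` are assumptions chosen for exposition.

```python
import numpy as np

def abcd_step(weights, grad, lr=0.1, ascent_frac=0.1, ascent_lr=0.01, rng=None):
    """One hypothetical ABCD update on a flat weight vector.

    A random subset of the weights (fraction `ascent_frac`) takes a
    gradient-ascent step with a small learning rate `ascent_lr`; the
    remaining weights take a standard gradient-descent step with `lr`,
    all computed from the same mini-batch gradient `grad`.
    """
    rng = np.random.default_rng() if rng is None else rng
    ascent_mask = rng.random(weights.shape) < ascent_frac  # random weight subset
    return np.where(ascent_mask,
                    weights + ascent_lr * grad,  # adversarial ascent on the subset
                    weights - lr * grad)         # descent on the remaining weights
```

In a training loop, `grad` would be the mini-batch gradient of the (smoothed) empirical loss, and the mask would be redrawn each step so the adversarially perturbed coordinates change from batch to batch.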
Cite
Text
Cicek and Soatto. "Input and Weight Space Smoothing for Semi-Supervised Learning." IEEE/CVF International Conference on Computer Vision Workshops, 2019. doi:10.1109/ICCVW.2019.00170
Markdown
[Cicek and Soatto. "Input and Weight Space Smoothing for Semi-Supervised Learning." IEEE/CVF International Conference on Computer Vision Workshops, 2019.](https://mlanthology.org/iccvw/2019/cicek2019iccvw-input/) doi:10.1109/ICCVW.2019.00170
BibTeX
@inproceedings{cicek2019iccvw-input,
title = {{Input and Weight Space Smoothing for Semi-Supervised Learning}},
author = {Cicek, Safa and Soatto, Stefano},
booktitle = {IEEE/CVF International Conference on Computer Vision Workshops},
year = {2019},
pages = {1344--1353},
doi = {10.1109/ICCVW.2019.00170},
url = {https://mlanthology.org/iccvw/2019/cicek2019iccvw-input/}
}