Cosine Model Watermarking Against Ensemble Distillation
Abstract
Many model watermarking methods have been developed to protect valuable deployed commercial models from being stealthily stolen through model distillation. However, the watermarks produced by most existing methods can be easily evaded by ensemble distillation, because averaging the outputs of multiple ensembled models can significantly weaken or even erase the watermarks. In this paper, we focus on the challenging task of defending against ensemble distillation. We propose a novel watermarking technique named CosWM that achieves outstanding model watermarking performance against ensemble distillation. CosWM is not only elegant in design but also comes with desirable theoretical guarantees. Our extensive experiments on public datasets demonstrate the excellent performance of CosWM and its advantages over state-of-the-art baselines.
Cite
Text
Charette et al. "Cosine Model Watermarking Against Ensemble Distillation." AAAI Conference on Artificial Intelligence, 2022. doi:10.1609/AAAI.V36I9.21184
Markdown
[Charette et al. "Cosine Model Watermarking Against Ensemble Distillation." AAAI Conference on Artificial Intelligence, 2022.](https://mlanthology.org/aaai/2022/charette2022aaai-cosine/) doi:10.1609/AAAI.V36I9.21184
BibTeX
@inproceedings{charette2022aaai-cosine,
title = {{Cosine Model Watermarking Against Ensemble Distillation}},
author = {Charette, Laurent and Chu, Lingyang and Chen, Yizhou and Pei, Jian and Wang, Lanjun and Zhang, Yong},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2022},
pages = {9512--9520},
doi = {10.1609/AAAI.V36I9.21184},
url = {https://mlanthology.org/aaai/2022/charette2022aaai-cosine/}
}