Image Shortcut Squeezing: Countering Perturbative Availability Poisons with Compression

Abstract

Perturbative availability poisoning (PAP) adds small changes to images to prevent their use for model training. Current research adopts the belief that practical and effective approaches to countering such poisons do not exist. In this paper, we argue that it is time to abandon this belief. We present extensive experiments showing that 12 state-of-the-art PAP methods are vulnerable to Image Shortcut Squeezing (ISS), which is based on simple compression. For example, on average, ISS restores CIFAR-10 model accuracy to 81.73%, surpassing the previous best preprocessing-based countermeasures by 37.97% absolute. ISS also (slightly) outperforms adversarial training, generalizes better to unseen perturbation norms, and is more efficient. Our investigation reveals that the properties of PAP perturbations depend on the type of surrogate model used for poison generation, which explains why a specific ISS compression yields the best performance against a specific type of PAP perturbation. We further test stronger, adaptive poisoning and show that it falls short of being an ideal defense against ISS. Overall, our results demonstrate the importance of considering various (simple) countermeasures to ensure the meaningfulness of analysis carried out during the development of availability poisons.
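
To make the idea of compression-based "shortcut squeezing" concrete, below is a minimal sketch (not the authors' released code) of the kind of preprocessing the abstract describes: round-tripping each possibly poisoned training image through grayscale conversion, lossy JPEG compression, or bit-depth reduction before training. The specific quality and bit-depth values are illustrative assumptions, not the paper's exact settings.

# Illustrative sketch of compression-based preprocessing in the spirit of ISS,
# using Pillow and NumPy. Parameter values below are assumptions for illustration.
import io
import numpy as np
from PIL import Image


def grayscale_squeeze(img: Image.Image) -> Image.Image:
    """Collapse color channels, then replicate back to 3 channels for the model."""
    return img.convert("L").convert("RGB")


def jpeg_squeeze(img: Image.Image, quality: int = 10) -> Image.Image:
    """Round-trip the image through lossy JPEG compression at the given quality."""
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return Image.open(buf).convert("RGB")


def bit_depth_squeeze(img: Image.Image, bits: int = 2) -> Image.Image:
    """Quantize each channel to 2**bits intensity levels."""
    arr = np.asarray(img).astype(np.float32) / 255.0
    levels = 2 ** bits - 1
    arr = np.round(arr * levels) / levels
    return Image.fromarray((arr * 255).astype(np.uint8))


if __name__ == "__main__":
    # Apply one of the squeezes to every (possibly poisoned) image before training.
    img = Image.new("RGB", (32, 32), color=(120, 60, 200))  # stand-in for a CIFAR-10 image
    for squeezed in (grayscale_squeeze(img), jpeg_squeeze(img), bit_depth_squeeze(img)):
        print(squeezed.size, squeezed.mode)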

Cite

Text

Liu et al. "Image Shortcut Squeezing: Countering Perturbative Availability Poisons with Compression." International Conference on Machine Learning, 2023.

Markdown

[Liu et al. "Image Shortcut Squeezing: Countering Perturbative Availability Poisons with Compression." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/liu2023icml-image/)

BibTeX

@inproceedings{liu2023icml-image,
  title     = {{Image Shortcut Squeezing: Countering Perturbative Availability Poisons with Compression}},
  author    = {Liu, Zhuoran and Zhao, Zhengyu and Larson, Martha},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {22473--22487},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/liu2023icml-image/}
}