Unlearning Personal Data from a Single Image
Abstract
Machine unlearning aims to erase data from a model as if the model had never seen them during training. While existing approaches unlearn information using complete or partial access to the training data, this access can become limited over time due to privacy regulations. Currently, no setting or benchmark exists to probe the effectiveness of unlearning methods in such scenarios. To fill this gap, we propose a novel task, which we call One-Shot Unlearning of Personal Identities (1-SHUI), that evaluates unlearning methods when the training data is not available. We focus on unlearning identity data, which is especially relevant given current regulations that require personal data to be deleted after training. To cope with the absence of training data, we expect users to provide a portrait picture to aid unlearning. We design forgetting requests on CelebA, CelebA-HQ, and MUFAC with different unlearning set sizes to evaluate applicable methods in 1-SHUI. Moreover, we propose MetaUnlearn, an effective method that meta-learns to forget identities from a single image. Our findings indicate that existing approaches struggle when data availability is limited, especially when there is a dissimilarity between the provided samples and the original training data.
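To make the 1-SHUI setting concrete, the sketch below illustrates the interface it implies: the unlearning procedure receives only the trained model and a single user-provided portrait, with no access to the training set. This is a minimal, hypothetical sketch using a naive gradient-ascent forgetting baseline, not the authors' MetaUnlearn method or any official benchmark API; the function name, hyperparameters, and input conventions are assumptions made for illustration.

```python
import torch
import torch.nn.functional as F

def one_shot_unlearn(model, user_image, user_label, steps=10, lr=1e-4):
    """Naive one-shot forgetting baseline (illustrative only, not MetaUnlearn).

    Performs gradient *ascent* on the loss of the single user-provided image,
    reflecting the 1-SHUI constraint: no training data, just one portrait.

    Args:
        model: a trained classifier (e.g., an identity classifier).
        user_image: a single image tensor of shape (C, H, W).
        user_label: a LongTensor holding the identity label to forget.
    """
    model.train()
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        logits = model(user_image.unsqueeze(0))              # batch of one portrait
        loss = -F.cross_entropy(logits, user_label.view(1))  # negate loss to forget
        loss.backward()
        opt.step()
    return model.eval()
```

In an evaluation resembling 1-SHUI, one would then compare the unlearned model's behavior on the forgotten identity against a model retrained without that identity, while checking that accuracy on retained identities is preserved.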
Cite
Text
De Min et al. "Unlearning Personal Data from a Single Image." Transactions on Machine Learning Research, 2025.

Markdown
[De Min et al. "Unlearning Personal Data from a Single Image." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/min2025tmlr-unlearning/)

BibTeX
@article{min2025tmlr-unlearning,
title = {{Unlearning Personal Data from a Single Image}},
author = {De Min, Thomas and Mancini, Massimiliano and Lathuilière, Stéphane and Roy, Subhankar and Ricci, Elisa},
journal = {Transactions on Machine Learning Research},
year = {2025},
url = {https://mlanthology.org/tmlr/2025/min2025tmlr-unlearning/}
}