Memorisation in Machine Learning: A Survey of Results

Abstract

Quantifying the impact of individual data samples on machine learning models is an open research problem. This is particularly relevant when complex and high-dimensional relationships have to be learned from a limited sample of the data-generating distribution, such as in deep learning. It was previously shown that, in these cases, models not only extract patterns that are helpful for generalisation, but also seem to be required to incorporate some of the training data more or less as is, in a process often termed memorisation. This raises the question: if some memorisation is a requirement for effective learning, what are its privacy implications? In this work we consider a broad range of previous definitions and perspectives on memorisation in ML, discuss their interplay with model generalisation, and examine the implications of these phenomena for data privacy. We then propose a framework for reasoning about what memorisation means in the context of ML training, viewed through the prism of an individual sample's influence on the model. Moreover, we systematise methods that allow practitioners to detect or quantify memorisation, and contextualise our findings across a broad range of ML settings. Finally, we discuss memorisation in the context of privacy attacks, differential privacy and adversarial actors.

Cite

Text

Usynin et al. "Memorisation in Machine Learning: A Survey of Results." Transactions on Machine Learning Research, 2024.

Markdown

[Usynin et al. "Memorisation in Machine Learning: A Survey of Results." Transactions on Machine Learning Research, 2024.](https://mlanthology.org/tmlr/2024/usynin2024tmlr-memorisation/)

BibTeX

@article{usynin2024tmlr-memorisation,
  title     = {{Memorisation in Machine Learning: A Survey of Results}},
  author    = {Usynin, Dmitrii and Knolle, Moritz and Kaissis, Georgios},
  journal   = {Transactions on Machine Learning Research},
  year      = {2024},
  url       = {https://mlanthology.org/tmlr/2024/usynin2024tmlr-memorisation/}
}