A Scalable and Efficient Iterative Method for Copying Machine Learning Classifiers
Abstract
Differential replication through copying refers to the process of replicating the decision behavior of a machine learning model using another model that possesses enhanced features and attributes. This process is relevant when external constraints limit the performance of an industrial predictive system. Under such circumstances, copying enables the retention of original prediction capabilities while adapting to new demands. Previous research has focused on single-pass implementations of copying. This paper introduces a novel sequential approach that significantly reduces the computational resources needed to train or maintain a copy, leading to lower maintenance costs for companies using machine learning models in production. The effectiveness of the sequential approach is demonstrated through experiments with synthetic and real-world datasets, showing significant reductions in time and resources while maintaining or improving accuracy.
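In broad terms, copying queries the original classifier on synthetic samples and fits a new model to the resulting labels; the sequential variant replaces one large single-pass sample with successive small batches and incremental updates. The sketch below illustrates that general idea, assuming scikit-learn; the data generator, the choice of SGDClassifier as the copy, and the batch and round sizes are illustrative assumptions, not the authors' exact method.

import numpy as np
from sklearn.datasets import make_moons
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)

# Original model whose decision behavior we want to copy.
X, y = make_moons(n_samples=1000, noise=0.2, random_state=0)
original = RandomForestClassifier(random_state=0).fit(X, y)

# Sequential copy: at each round, draw a fresh batch of synthetic points,
# label them with the original model, and take an incremental update step
# instead of storing and fitting one large single-pass sample.
copy = SGDClassifier(random_state=0)
classes = np.unique(y)
for _ in range(50):
    batch = rng.uniform(low=X.min(axis=0), high=X.max(axis=0), size=(256, 2))
    labels = original.predict(batch)
    copy.partial_fit(batch, labels, classes=classes)

# Fidelity: agreement between copy and original on held-out synthetic points.
test = rng.uniform(low=X.min(axis=0), high=X.max(axis=0), size=(2000, 2))
print("fidelity:", np.mean(copy.predict(test) == original.predict(test)))

Because each batch is discarded after its update, memory stays constant across rounds, which is the practical motivation for the sequential setting described in the abstract.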
Cite
Text
Statuto et al. "A Scalable and Efficient Iterative Method for Copying Machine Learning Classifiers." Journal of Machine Learning Research, 2023.
Markdown
[Statuto et al. "A Scalable and Efficient Iterative Method for Copying Machine Learning Classifiers." Journal of Machine Learning Research, 2023.](https://mlanthology.org/jmlr/2023/statuto2023jmlr-scalable/)
BibTeX
@article{statuto2023jmlr-scalable,
title = {{A Scalable and Efficient Iterative Method for Copying Machine Learning Classifiers}},
author = {Statuto, Nahuel and Unceta, Irene and Nin, Jordi and Pujol, Oriol},
journal = {Journal of Machine Learning Research},
year = {2023},
pages = {1--34},
volume = {24},
url = {https://mlanthology.org/jmlr/2023/statuto2023jmlr-scalable/}
}