Continual Learning in Linear Classification on Separable Data
Abstract
We analyze continual learning on a sequence of separable linear classification tasks with binary labels. We show theoretically that learning with weak regularization reduces to solving a sequential max-margin problem, corresponding to a special case of the Projection Onto Convex Sets (POCS) framework. We then develop upper bounds on the forgetting and other quantities of interest under various settings with recurring tasks, including cyclic and random orderings of tasks. We discuss several practical implications for popular training practices like regularization scheduling and weighting. We point out several theoretical differences between our continual classification setting and a recently studied continual regression setting.
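To make the reduction concrete: under weak regularization, each task's solution becomes the Euclidean projection of the previous weights onto that task's margin-constraint set, i.e., one POCS step per task. Below is a minimal sketch of this sequential max-margin scheme (not the authors' implementation); it uses cvxpy, and the two synthetic separable tasks and all variable names are illustrative assumptions.

import cvxpy as cp
import numpy as np

def project_onto_task(w_prev, X, y):
    # One POCS step: Euclidean projection of w_prev onto the convex set
    # {w : y_i * <x_i, w> >= 1 for all i}, this task's margin constraints.
    w = cp.Variable(w_prev.shape[0])
    problem = cp.Problem(cp.Minimize(cp.sum_squares(w - w_prev)),
                         [cp.multiply(y, X @ w) >= 1])
    problem.solve()
    return w.value

rng = np.random.default_rng(0)
d, n = 5, 20
# Two toy linearly separable tasks (labels from random ground-truth directions).
tasks = []
for _ in range(2):
    X = rng.standard_normal((n, d))
    y = np.sign(X @ rng.standard_normal(d))
    tasks.append((X, y))

w = np.zeros(d)  # from w = 0, the first projection is task 1's max-margin solution
for X, y in tasks:  # a fixed (e.g., cyclic) ordering of the tasks
    w = project_onto_task(w, X, y)

Repeating the loop over a recurring task sequence iterates these projections, which is the regime in which the paper's forgetting bounds apply.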
Cite
Text
Evron et al. "Continual Learning in Linear Classification on Separable Data." International Conference on Machine Learning, 2023.

Markdown

[Evron et al. "Continual Learning in Linear Classification on Separable Data." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/evron2023icml-continual/)

BibTeX
@inproceedings{evron2023icml-continual,
title = {{Continual Learning in Linear Classification on Separable Data}},
author = {Evron, Itay and Moroshko, Edward and Buzaglo, Gon and Khriesh, Maroun and Marjieh, Badea and Srebro, Nathan and Soudry, Daniel},
booktitle = {International Conference on Machine Learning},
year = {2023},
pages = {9440--9484},
volume = {202},
url = {https://mlanthology.org/icml/2023/evron2023icml-continual/}
}