Implicit Regularization with Polynomial Growth in Deep Tensor Factorization
Abstract
We study the implicit regularization effects of deep learning in tensor factorization. While implicit regularization in deep matrix and 'shallow' tensor factorization via linear and certain types of non-linear neural networks promotes low-rank solutions with at most quadratic growth, we show that its effect in deep tensor factorization grows polynomially with the depth of the network. This provides a remarkably faithful description of the observed experimental behaviour. Using numerical experiments, we demonstrate the benefits of this implicit regularization in yielding more accurate estimation and better convergence properties.
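To make the setup concrete, the following is a minimal illustrative sketch, not the authors' experimental code: an order-3 low-rank tensor is fitted by a CP-style model in which each factor matrix is parameterized as a product of `depth` matrices and trained by plain gradient descent; the singular values of a mode-1 unfolding can then be inspected to see the low-rank bias of the deep parameterization. All variable names and hyperparameters below are assumptions and may need tuning.

```python
# Illustrative sketch of deep tensor factorization via gradient descent.
# Not the paper's code; hyperparameters are assumptions chosen for a small example.
import numpy as np

rng = np.random.default_rng(0)
n, true_rank, fit_rank, depth = 10, 2, 10, 3
lr, steps = 0.005, 30000

# Ground-truth low-rank order-3 tensor T = sum_r a_r (x) b_r (x) c_r
A, B, C = (rng.standard_normal((n, true_rank)) for _ in range(3))
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# Deep parameterization: each CP factor matrix is a product of `depth` matrices.
params = [[0.2 * rng.standard_normal((n, n if d < depth - 1 else fit_rank))
           for d in range(depth)] for _ in range(3)]

def collapse(chain):
    """Multiply a chain of matrices into a single collapsed factor."""
    out = chain[0]
    for M in chain[1:]:
        out = out @ M
    return out

for step in range(steps):
    U, V, W = (collapse(c) for c in params)
    R = np.einsum('ir,jr,kr->ijk', U, V, W) - T          # residual tensor
    # Gradients of 0.5 * ||residual||^2 w.r.t. the collapsed factors
    gU = np.einsum('ijk,jr,kr->ir', R, V, W)
    gV = np.einsum('ijk,ir,kr->jr', R, U, W)
    gW = np.einsum('ijk,ir,jr->kr', R, U, V)
    for chain, g in zip(params, (gU, gV, gW)):
        # Chain rule through the matrix product: grad of M_d is L^T g R^T,
        # where L and R are the products of the matrices left/right of M_d.
        grads = []
        for d in range(depth):
            L = collapse(chain[:d]) if d > 0 else np.eye(n)
            Rm = collapse(chain[d + 1:]) if d < depth - 1 else np.eye(chain[d].shape[1])
            grads.append(L.T @ g @ Rm.T)
        for d in range(depth):
            chain[d] -= lr * grads[d]

U, V, W = (collapse(c) for c in params)
est = np.einsum('ir,jr,kr->ijk', U, V, W)
print('relative error:', np.linalg.norm(est - T) / np.linalg.norm(T))
print('singular values of mode-1 unfolding:',
      np.round(np.linalg.svd(est.reshape(n, -1), compute_uv=False), 3))
```

In this kind of experiment, the spectrum of the unfolded estimate is what reveals the implicit bias: even though `fit_rank` allows a full-rank solution, gradient descent on the deep parameterization tends to concentrate energy in a few components, and the effect is expected to strengthen as `depth` increases.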
Cite
Text
Hariz et al. "Implicit Regularization with Polynomial Growth in Deep Tensor Factorization." International Conference on Machine Learning, 2022.
Markdown
[Hariz et al. "Implicit Regularization with Polynomial Growth in Deep Tensor Factorization." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/hariz2022icml-implicit/)
BibTeX
@inproceedings{hariz2022icml-implicit,
title = {{Implicit Regularization with Polynomial Growth in Deep Tensor Factorization}},
author = {Hariz, Kais and Kadri, Hachem and Ayache, Stephane and Moakher, Maher and Artieres, Thierry},
booktitle = {International Conference on Machine Learning},
year = {2022},
pages = {8484--8501},
volume = {162},
url = {https://mlanthology.org/icml/2022/hariz2022icml-implicit/}
}