Divide-and-Conquer Learning with Nyström: Optimal Rate and Algorithm
Abstract
Kernel Regularized Least Squares (KRLS) is a fundamental learner in machine learning. However, due to its high time and space requirements, it does not scale to large-scale scenarios. We therefore propose DC-NY, a novel algorithm that combines the divide-and-conquer method, Nyström approximation, conjugate gradient, and preconditioning to scale up KRLS. DC-NY attains the same accuracy as exact KRLS with the lowest time and space complexity among state-of-the-art approximate KRLS estimators. We present a theoretical analysis of DC-NY, including a novel error decomposition with optimal statistical accuracy guarantees. Extensive experiments on several real-world large-scale datasets containing up to 1M data points show that DC-NY significantly outperforms state-of-the-art approximate KRLS estimators.
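To make the high-level description concrete, the following is a minimal sketch, not the authors' implementation, of the divide-and-conquer Nyström KRLS idea: split the data into m partitions, fit a Nyström-approximated KRLS model on each partition with a conjugate-gradient solve, and average the local predictors. All names (`dc_nystrom_krls`, `fit_partition`) and parameter choices are illustrative assumptions, and the paper's preconditioning step is omitted for brevity.

```python
import numpy as np
from scipy.sparse.linalg import cg

def rbf(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_partition(X, y, p, lam, gamma, rng):
    """Nystrom KRLS on one partition, solved with conjugate gradient.

    Solves (Knm^T Knm + lam * n * Kmm) alpha = Knm^T y, the standard
    Nystrom regularized least-squares system with p landmarks.
    (Illustrative; the paper additionally uses preconditioning.)
    """
    Z = X[rng.choice(len(X), size=min(p, len(X)), replace=False)]
    Knm = rbf(X, Z, gamma)              # n x p cross-kernel
    Kmm = rbf(Z, Z, gamma)              # p x p landmark kernel
    A = Knm.T @ Knm + lam * len(X) * Kmm
    b = Knm.T @ y
    alpha, _ = cg(A, b)                 # CG solve of the small p x p system
    return Z, alpha

def dc_nystrom_krls(X, y, m=4, p=100, lam=1e-3, gamma=1.0, seed=0):
    """Divide-and-conquer: fit Nystrom KRLS per split, average predictions."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    models = [fit_partition(X[s], y[s], p, lam, gamma, rng)
              for s in np.array_split(idx, m)]
    def predict(Xt):
        return np.mean([rbf(Xt, Z, gamma) @ a for Z, a in models], axis=0)
    return predict
```

Averaging the m local estimators is what keeps the per-partition cost low (each CG solve touches only a p x p system) while, as the paper's analysis argues for DC-NY, retaining the statistical accuracy of the exact estimator.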
Cite
Text
Yin et al. "Divide-and-Conquer Learning with Nyström: Optimal Rate and Algorithm." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I04.6147

Markdown
[Yin et al. "Divide-and-Conquer Learning with Nyström: Optimal Rate and Algorithm." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/yin2020aaai-divide/) doi:10.1609/AAAI.V34I04.6147

BibTeX
@inproceedings{yin2020aaai-divide,
title = {{Divide-and-Conquer Learning with Nyström: Optimal Rate and Algorithm}},
author = {Yin, Rong and Liu, Yong and Lu, Lijing and Wang, Weiping and Meng, Dan},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2020},
pages = {6696--6703},
doi = {10.1609/AAAI.V34I04.6147},
url = {https://mlanthology.org/aaai/2020/yin2020aaai-divide/}
}