Sublinear Quantum Algorithms for Training Linear and Kernel-Based Classifiers
Abstract
We investigate quantum algorithms for classification, a fundamental problem in machine learning, with provable guarantees. Given $n$ $d$-dimensional data points, the state-of-the-art classical algorithm for training classifiers with constant margin, due to Clarkson et al., runs in $\tilde{O}(n+d)$ time, which is optimal in its input/output model. We design sublinear quantum algorithms for the same task running in $\tilde{O}(\sqrt{n}+\sqrt{d})$ time, a quadratic improvement in both $n$ and $d$. Moreover, our algorithms use the standard quantization of the classical input and generate the same classical output, suggesting minimal overhead when they are used as subroutines in end-to-end applications. We also prove a matching lower bound (up to poly-log factors) and discuss the possibility of implementation on near-term quantum machines.
Cite
Text
Li et al. "Sublinear Quantum Algorithms for Training Linear and Kernel-Based Classifiers." International Conference on Machine Learning, 2019.

Markdown

[Li et al. "Sublinear Quantum Algorithms for Training Linear and Kernel-Based Classifiers." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/li2019icml-sublinear/)

BibTeX
@inproceedings{li2019icml-sublinear,
title = {{Sublinear Quantum Algorithms for Training Linear and Kernel-Based Classifiers}},
author = {Li, Tongyang and Chakrabarti, Shouvanik and Wu, Xiaodi},
booktitle = {International Conference on Machine Learning},
year = {2019},
  pages     = {3815--3824},
volume = {97},
url = {https://mlanthology.org/icml/2019/li2019icml-sublinear/}
}