Block-Quantized Kernel Matrix for Fast Spectral Embedding
Abstract
Eigendecomposition of the kernel matrix is an indispensable procedure in many learning and vision tasks. However, the cubic complexity O(N³) is impractical for large problems, where N is the data size. In this paper, we propose an efficient approach to the eigendecomposition of the kernel matrix W. The idea is to approximate W with W̄, a matrix composed of m² constant blocks. The eigenvectors of W̄, which can be solved in O(m³) time, are then used to recover the eigenvectors of the original kernel matrix. The complexity of our method is only O(mN + m³), which scales more favorably than state-of-the-art low-rank approximation and sampling-based approaches (O(m²N + m³)), and the approximation quality can be controlled conveniently. Our method demonstrates encouraging scaling behavior in experiments on image segmentation (by spectral clustering) and kernel principal component analysis.
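A minimal NumPy sketch of the idea described in the abstract. The kernel choice (RBF), the use of k-means as the quantizer, and all function names are assumptions for illustration, not the paper's exact algorithm: the data are quantized into m cells, so the quantized kernel matrix W̄ = P Kc Pᵀ has m² constant blocks (P is the N × m cell-membership indicator, Kc the m × m kernel on centroids), and its nonzero eigenpairs are recovered from an m × m eigenproblem.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kmeans(X, m, iters=50):
    """Plain Lloyd's k-means quantizer; returns (centroids, labels)."""
    C = X[rng.choice(len(X), m, replace=False)].copy()
    for _ in range(iters):
        lab = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1).argmin(1)
        for j in range(m):
            if (lab == j).any():
                C[j] = X[lab == j].mean(0)
    return C, lab

def block_quantized_eig(X, m, gamma=1.0):
    """Approximate eigenpairs of the N x N kernel matrix from an
    m x m eigendecomposition of its block-quantized version Wbar."""
    C, lab = kmeans(X, m)
    n = np.bincount(lab, minlength=m).astype(float)   # cell sizes
    Kc = rbf(C, C, gamma)                             # m x m centroid kernel
    S = np.sqrt(n)
    # Nonzero eigenpairs of Wbar = P Kc P^T coincide with those of the
    # small symmetric matrix A = S Kc S, where S = diag(sqrt(cell sizes)).
    A = (S[:, None] * Kc) * S[None, :]
    lam, U = np.linalg.eigh(A)                        # ascending order
    # Lift each m-dim eigenvector u back to N dims: v = P S^{-1} u.
    V = U[lab] / S[lab][:, None]
    return lam[::-1], V[:, ::-1], C, lab              # descending order
```

Since W̄ equals `Kc[lab][:, lab]`, each lifted column v satisfies W̄v = λv exactly; the O(N³) problem on W is thus replaced by an O(m³) one plus an O(mN) lift, matching the complexity quoted in the abstract.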
Cite
Text
Zhang and Kwok. "Block-Quantized Kernel Matrix for Fast Spectral Embedding." International Conference on Machine Learning, 2006. doi:10.1145/1143844.1143982
Markdown
[Zhang and Kwok. "Block-Quantized Kernel Matrix for Fast Spectral Embedding." International Conference on Machine Learning, 2006.](https://mlanthology.org/icml/2006/zhang2006icml-block/) doi:10.1145/1143844.1143982
BibTeX
@inproceedings{zhang2006icml-block,
title = {{Block-Quantized Kernel Matrix for Fast Spectral Embedding}},
author = {Zhang, Kai and Kwok, James T.},
booktitle = {International Conference on Machine Learning},
year = {2006},
pages = {1097--1104},
doi = {10.1145/1143844.1143982},
url = {https://mlanthology.org/icml/2006/zhang2006icml-block/}
}