Optimizing over Radial Kernels on Compact Manifolds
Abstract
We tackle the problem of optimizing over all possible positive definite radial kernels on Riemannian manifolds for classification. Kernel methods on Riemannian manifolds have recently become increasingly popular in computer vision. However, the number of known positive definite kernels on manifolds remains very limited. Furthermore, most kernels typically depend on at least one parameter that needs to be tuned for the problem at hand. A poor choice of kernel, or of parameter value, can lead to a significant drop in performance. Here, we show that positive definite radial kernels on the unit $n$-sphere, the Grassmann manifold and Kendall's shape manifold can be expressed in a simple form whose parameters can be automatically optimized within a support vector machine framework. We demonstrate the benefits of our kernel learning algorithm on object, face, action and shape recognition.
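The "simple form" the abstract refers to can be illustrated with a radial kernel on the unit sphere written as a non-negative combination of powers of the inner product, which is positive definite by Schoenberg's characterization. The sketch below is only illustrative and not the authors' implementation: the function `radial_kernel` and the fixed `weights` are assumptions for demonstration, whereas the paper learns the expansion coefficients jointly within the SVM framework.

```python
# Minimal sketch (not the paper's code): a radial kernel on the unit n-sphere
# of the form k(x, y) = sum_i w_i * <x, y>^i with w_i >= 0, used with an SVM.
import numpy as np
from sklearn.svm import SVC

def radial_kernel(X, Y, weights):
    """Gram matrix k(x, y) = sum_i weights[i] * <x, y>^i (weights[i] >= 0)."""
    G = X @ Y.T                      # inner products = cosines of geodesic distances
    K = np.zeros_like(G)
    for i, w in enumerate(weights):
        K += w * G**i
    return K

# Toy data on the unit sphere: normalize rows to unit length.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
X /= np.linalg.norm(X, axis=1, keepdims=True)
y = (X[:, 0] > 0).astype(int)

# Hand-picked non-negative weights; the paper optimizes these automatically.
weights = np.array([0.1, 0.5, 0.3, 0.1])
K = radial_kernel(X, X, weights)
clf = SVC(kernel="precomputed").fit(K, y)
print(clf.score(K, y))
```

Any non-negative choice of weights keeps the kernel positive definite, which is what makes it possible to search over this family of kernels while training the classifier.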
Cite
Text
Jayasumana et al. "Optimizing over Radial Kernels on Compact Manifolds." Conference on Computer Vision and Pattern Recognition, 2014. doi:10.1109/CVPR.2014.480

Markdown

[Jayasumana et al. "Optimizing over Radial Kernels on Compact Manifolds." Conference on Computer Vision and Pattern Recognition, 2014.](https://mlanthology.org/cvpr/2014/jayasumana2014cvpr-optimizing/) doi:10.1109/CVPR.2014.480

BibTeX
@inproceedings{jayasumana2014cvpr-optimizing,
title = {{Optimizing over Radial Kernels on Compact Manifolds}},
author = {Jayasumana, Sadeep and Hartley, Richard and Salzmann, Mathieu and Li, Hongdong and Harandi, Mehrtash},
booktitle = {Conference on Computer Vision and Pattern Recognition},
year = {2014},
doi = {10.1109/CVPR.2014.480},
url = {https://mlanthology.org/cvpr/2014/jayasumana2014cvpr-optimizing/}
}