Optimal Kernel Selection in Kernel Fisher Discriminant Analysis

Abstract

In Kernel Fisher discriminant analysis (KFDA), we carry out Fisher linear discriminant analysis in a high-dimensional feature space defined implicitly by a kernel. The performance of KFDA depends on the choice of the kernel; in this paper, we consider the problem of finding the optimal kernel over a given convex set of kernels. We show that this optimal kernel selection problem can be reformulated as a tractable convex optimization problem, which interior-point methods can solve globally and efficiently. The kernel selection method is demonstrated on several UCI machine learning benchmark examples.
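For context on the base problem that kernel selection builds on: for a fixed kernel, two-class KFDA maximizes a regularized Rayleigh quotient in the dual, giving discriminant coefficients alpha proportional to (N + reg*I)^(-1)(m_1 - m_2), where N is the dual within-class scatter and m_c are per-class mean kernel columns. The sketch below is illustrative only (the RBF kernel choice, regularization value, and all names are assumptions, not taken from the paper); it does not implement the paper's convex kernel-selection procedure.

```python
import numpy as np

def kfda_direction(X, y, gamma=1.0, reg=1e-3):
    """Two-class kernel Fisher discriminant with an RBF kernel (illustrative).

    Returns dual coefficients alpha and the kernel matrix K; a point x_j
    is projected onto the discriminant as sum_i alpha_i * k(x_i, x_j).
    """
    # RBF (Gaussian) kernel matrix
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

    classes = np.unique(y)
    n = len(y)
    # Per-class mean kernel columns: m_c[i] = mean over j in class c of k(x_i, x_j)
    m = [K[:, y == c].mean(axis=1) for c in classes]
    # Dual within-class scatter: N = sum_c K_c (I - 1/n_c * 11^T) K_c^T
    N = np.zeros((n, n))
    for c in classes:
        Kc = K[:, y == c]
        nc = Kc.shape[1]
        N += Kc @ (np.eye(nc) - np.ones((nc, nc)) / nc) @ Kc.T
    # Regularized Rayleigh-quotient solution: alpha ∝ (N + reg*I)^{-1}(m_1 - m_2)
    alpha = np.linalg.solve(N + reg * np.eye(n), m[0] - m[1])
    return alpha, K

# Toy demonstration: projections of the two classes separate
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.3, (20, 2)), rng.normal(1.0, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
alpha, K = kfda_direction(X, y)
proj = K @ alpha
```

Since alpha is computed from (m_1 - m_2) through a positive definite matrix, the mean projection of the first class always exceeds that of the second, so the sign convention is fixed. The paper's contribution is to optimize over a convex set of kernels K on top of this fixed-kernel criterion.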

Cite

Text

Kim et al. "Optimal Kernel Selection in Kernel Fisher Discriminant Analysis." International Conference on Machine Learning, 2006. doi:10.1145/1143844.1143903

Markdown

[Kim et al. "Optimal Kernel Selection in Kernel Fisher Discriminant Analysis." International Conference on Machine Learning, 2006.](https://mlanthology.org/icml/2006/kim2006icml-optimal/) doi:10.1145/1143844.1143903

BibTeX

@inproceedings{kim2006icml-optimal,
  title     = {{Optimal Kernel Selection in Kernel Fisher Discriminant Analysis}},
  author    = {Kim, Seung-Jean and Magnani, Alessandro and Boyd, Stephen P.},
  booktitle = {International Conference on Machine Learning},
  year      = {2006},
  pages     = {465--472},
  doi       = {10.1145/1143844.1143903},
  url       = {https://mlanthology.org/icml/2006/kim2006icml-optimal/}
}