Constructing Nonlinear Discriminants from Multiple Data Views
Abstract
There are many situations in which we have more than one view of a single data source, or in which we have multiple aligned sources of data. We would like to build classifiers that incorporate these views to enhance classification performance. Kernel Fisher Discriminant Analysis (KFDA) can be formulated as a convex optimisation problem, which we extend to the multiview setting (MFDA) and for which we introduce a sparse version (SMFDA). We show that our formulations are justified from both probabilistic and learning-theory perspectives. We then extend the optimisation problem to account for directions unique to each view (PMFDA). We present experimental validation on a toy dataset, followed by results on a brain imaging dataset and part of the PASCAL VOC 2007 challenge dataset.
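The abstract notes that KFDA can be cast as a convex optimisation problem. One standard convex formulation is the regularised least-squares view, in which the dual weights are fit by regressing the kernel scores onto the ±1 class labels. The sketch below illustrates that idea only; it is not the paper's multiview formulation, and the function names, the RBF kernel choice, and the regulariser `mu` are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix of an RBF kernel between the rows of X and Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kfda_fit(K, y, mu=1e-2):
    """Least-squares view of KFDA (illustrative): find dual weights
    alpha and bias b minimising ||K alpha + b 1 - y||^2 + mu alpha' K alpha,
    a convex problem solved here via its normal equations."""
    n = K.shape[0]
    A = np.hstack([K, np.ones((n, 1))])       # stack [K | 1]
    reg = np.zeros((n + 1, n + 1))
    reg[:n, :n] = mu * K                      # regularise alpha only
    sol = np.linalg.solve(A.T @ A + reg, A.T @ y)
    return sol[:n], sol[n]

# Toy two-class problem: two well-separated Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (20, 2)), rng.normal(1, 0.5, (20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])
K = rbf_kernel(X, X)
alpha, b = kfda_fit(K, y)
pred = np.sign(K @ alpha + b)
acc = (pred == y).mean()
```

Because the objective is a regularised quadratic, the same recipe extends naturally to extra convex constraints or penalties, which is the kind of flexibility the multiview and sparse variants in the paper exploit.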
Cite

Text

Diethe et al. "Constructing Nonlinear Discriminants from Multiple Data Views." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2010. doi:10.1007/978-3-642-15880-3_27

Markdown

[Diethe et al. "Constructing Nonlinear Discriminants from Multiple Data Views." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2010.](https://mlanthology.org/ecmlpkdd/2010/diethe2010ecmlpkdd-constructing/) doi:10.1007/978-3-642-15880-3_27

BibTeX
@inproceedings{diethe2010ecmlpkdd-constructing,
title = {{Constructing Nonlinear Discriminants from Multiple Data Views}},
author = {Diethe, Tom and Hardoon, David R. and Shawe-Taylor, John},
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
year = {2010},
  pages = {328--343},
doi = {10.1007/978-3-642-15880-3_27},
url = {https://mlanthology.org/ecmlpkdd/2010/diethe2010ecmlpkdd-constructing/}
}