Kernelizing Sorting, Permutation, and Alignment for Minimum Volume PCA

Abstract

We propose an algorithm for permuting or sorting multiple sets (or bags) of objects such that they can ultimately be represented efficiently using kernel principal component analysis. This framework generalizes sorting from scalars to arbitrary inputs, since all computations involve inner products which can be carried out in Hilbert space and kernelized. The cost function on the permutations or orderings emerges from a maximum likelihood Gaussian solution which approximately minimizes the volume the data occupies in Hilbert space. This ensures that few kernel principal components are necessary to capture the variation of the sets or bags. Both global and almost-global solutions are provided via iterative algorithms that interleave variational bounding (on quadratic assignment problems) with the Kuhn-Munkres algorithm (for solving linear assignment problems).
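The linear assignment step the abstract mentions can be illustrated with a minimal sketch: given two bags of objects and a kernel, find the permutation maximizing the total kernel alignment between matched pairs. The sketch below is a hypothetical toy illustration, not the paper's implementation; for brevity it brute-forces the permutation (exact but O(n!)), whereas the Kuhn-Munkres algorithm solves the same linear assignment problem in polynomial time. The linear kernel and the toy data are assumptions.

```python
from itertools import permutations


def linear_kernel(x, y):
    # Plain inner product; any positive-definite kernel could be
    # substituted here, which is what makes the framework "kernelized".
    return sum(a * b for a, b in zip(x, y))


def best_alignment(X, Y, kernel=linear_kernel):
    """Find the permutation pi maximizing sum_i k(X[i], Y[pi(i)]).

    Brute-force stand-in for the Kuhn-Munkres step: it enumerates all
    n! permutations, so it is only viable for tiny bags.
    """
    n = len(X)
    best_pi, best_score = None, float("-inf")
    for pi in permutations(range(n)):
        score = sum(kernel(X[i], Y[pi[i]]) for i in range(n))
        if score > best_score:
            best_pi, best_score = list(pi), score
    return best_pi, best_score


# Hypothetical toy bags of 2-D points.
X = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
Y = [(0.1, 0.9), (0.0, 0.1), (1.1, 0.0)]
pi, score = best_alignment(X, Y)
# pi matches each point in X to its nearest (by inner product) point in Y.
```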

Cite

Text

Jebara. "Kernelizing Sorting, Permutation, and Alignment for Minimum Volume PCA." Annual Conference on Computational Learning Theory, 2004. doi:10.1007/978-3-540-27819-1_42

Markdown

[Jebara. "Kernelizing Sorting, Permutation, and Alignment for Minimum Volume PCA." Annual Conference on Computational Learning Theory, 2004.](https://mlanthology.org/colt/2004/jebara2004colt-kernelizing/) doi:10.1007/978-3-540-27819-1_42

BibTeX

@inproceedings{jebara2004colt-kernelizing,
  title     = {{Kernelizing Sorting, Permutation, and Alignment for Minimum Volume PCA}},
  author    = {Jebara, Tony},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {2004},
  pages     = {609--623},
  doi       = {10.1007/978-3-540-27819-1_42},
  url       = {https://mlanthology.org/colt/2004/jebara2004colt-kernelizing/}
}