Kernel Information Embeddings
Abstract
We describe a family of embedding algorithms that are based on nonparametric estimates of mutual information (MI). Using Parzen window estimates of the distribution in the joint (input, embedding)-space, we derive an MI-based objective function for dimensionality reduction that can be optimized directly with respect to a set of latent data representatives. Various types of supervision signal can be introduced within the framework by replacing plain MI with several forms of conditional MI. Examples of the semi-(un)supervised algorithms that we obtain this way are a new model for manifold alignment, and a new type of embedding method that performs 'conditional dimensionality reduction'.
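The core idea in the abstract can be illustrated with a minimal sketch: estimate MI with Gaussian Parzen windows over the joint (input, embedding) space, then maximize that estimate directly over the latent coordinates. This is not the paper's exact derivation; the bandwidths `hx` and `hz`, the optimizer choice, and the function names below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def gauss_gram(A, h):
    # Pairwise Gaussian kernel matrix with bandwidth h (assumed fixed).
    d2 = ((A[:, None, :] - A[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * h ** 2))

def kde_mi(X, Z, hx=1.0, hz=0.5):
    # Parzen-window MI estimate in the joint space:
    #   I ≈ (1/N) Σ_i log[ p̂(x_i, z_i) / (p̂(x_i) p̂(z_i)) ]
    # With a product kernel for the joint density, the kernel
    # normalization constants cancel inside the log ratio.
    Kx, Kz = gauss_gram(X, hx), gauss_gram(Z, hz)
    joint = (Kx * Kz).mean(1)
    return np.mean(np.log(joint / (Kx.mean(1) * Kz.mean(1)) + 1e-12))

def fit_embedding(X, dim=1, seed=0, iters=200):
    # Optimize the MI objective directly with respect to the
    # latent representatives Z (one per data point).
    rng = np.random.default_rng(seed)
    z0 = 0.1 * rng.standard_normal((len(X), dim))
    obj = lambda z: -kde_mi(X, z.reshape(len(X), dim))
    res = minimize(obj, z0.ravel(), method="L-BFGS-B",
                   options={"maxiter": iters})
    return res.x.reshape(len(X), dim)
```

For example, `fit_embedding(X, dim=1)` on points sampled along a 2-D curve returns one scalar latent coordinate per point; the supervised variants described in the abstract would replace `kde_mi` with a conditional-MI estimate.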
Cite
Text
Memisevic. "Kernel Information Embeddings." International Conference on Machine Learning, 2006. doi:10.1145/1143844.1143924
BibTeX
@inproceedings{memisevic2006icml-kernel,
title = {{Kernel Information Embeddings}},
author = {Memisevic, Roland},
booktitle = {International Conference on Machine Learning},
year = {2006},
pages = {633-640},
doi = {10.1145/1143844.1143924},
url = {https://mlanthology.org/icml/2006/memisevic2006icml-kernel/}
}