Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation

Abstract

The goal of sufficient dimension reduction in supervised learning is to find a low-dimensional subspace of input features that is "sufficient" for predicting output values. In this paper, we propose a novel sufficient dimension reduction method using a squared-loss variant of mutual information as a dependency measure. We utilize an analytic approximator of squared-loss mutual information based on density ratio estimation, which is shown to possess suitable convergence properties. We then develop a natural gradient algorithm for sufficient subspace search. Numerical experiments show that the proposed method compares favorably with existing dimension reduction approaches.
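The estimator described in the abstract can be illustrated with a minimal sketch: fit the density ratio w(x, y) = p(x, y) / (p(x) p(y)) with a linear model over Gaussian kernel basis functions by least squares, which admits an analytic solution, and plug the fitted ratio into SMI = (1/2) E_{p(x,y)}[w] − 1/2. The function name `lsmi`, the fixed kernel width `sigma`, and the ridge parameter `lam` below are illustrative assumptions; in practice such hyperparameters would be tuned, e.g. by cross-validation.

```python
import numpy as np

def lsmi(x, y, sigma=1.0, lam=1e-3, n_centers=100, seed=0):
    """Least-squares plug-in estimate of squared-loss mutual information.

    Hedged sketch: models the density ratio w(x, y) = p(x, y) / (p(x) p(y))
    as a linear combination of Gaussian kernels centered at a random subset
    of the paired samples. Hyperparameters are fixed here for simplicity.
    """
    x = np.asarray(x, float).reshape(len(x), -1)
    y = np.asarray(y, float).reshape(len(y), -1)
    n = len(x)
    rng = np.random.default_rng(seed)
    idx = rng.choice(n, size=min(n_centers, n), replace=False)
    ux, uy = x[idx], y[idx]  # kernel centers taken from the sample

    # Kx[i, l] = exp(-||x_i - u_l||^2 / (2 sigma^2)); likewise Ky for y.
    Kx = np.exp(-((x[:, None, :] - ux[None]) ** 2).sum(-1) / (2 * sigma**2))
    Ky = np.exp(-((y[:, None, :] - uy[None]) ** 2).sum(-1) / (2 * sigma**2))

    # h_l = (1/n) sum_i phi_l(x_i, y_i): average of the basis over the
    # joint samples (paired x and y).
    h = (Kx * Ky).mean(axis=0)

    # H_{l,l'} = (1/n^2) sum_{i,j} phi_l(x_i, y_j) phi_{l'}(x_i, y_j):
    # average over the product of marginals (all x-y pairings). The double
    # sum factorizes, so an elementwise product of Gram matrices suffices.
    H = (Kx.T @ Kx) * (Ky.T @ Ky) / n**2

    # Analytic ridge-regularized least-squares solution for the ratio
    # coefficients, then the SMI plug-in estimate.
    alpha = np.linalg.solve(H + lam * np.eye(len(h)), h)
    return 0.5 * h @ alpha - 0.5

# Demo: the estimate should be clearly positive for dependent data and
# much smaller for independent data.
rng = np.random.default_rng(1)
x = rng.standard_normal(500)
smi_dep = lsmi(x, x + 0.1 * rng.standard_normal(500))
smi_ind = lsmi(x, rng.standard_normal(500))
print(f"dependent: {smi_dep:.3f}, independent: {smi_ind:.3f}")
```

Because the least-squares problem has a closed-form solution, this dependency measure can be evaluated analytically inside an outer subspace-search loop, which is what makes the gradient-based sufficient subspace search in the paper practical.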

Cite

Text

Suzuki and Sugiyama. "Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation." Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, 2010.

Markdown

[Suzuki and Sugiyama. "Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation." Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, 2010.](https://mlanthology.org/aistats/2010/suzuki2010aistats-sufficient/)

BibTeX

@inproceedings{suzuki2010aistats-sufficient,
  title     = {{Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation}},
  author    = {Suzuki, Taiji and Sugiyama, Masashi},
  booktitle = {Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics},
  year      = {2010},
  pages     = {804--811},
  volume    = {9},
  url       = {https://mlanthology.org/aistats/2010/suzuki2010aistats-sufficient/}
}