Computationally Efficient Sufficient Dimension Reduction via Squared-Loss Mutual Information
Abstract
The purpose of sufficient dimension reduction (SDR) is to find a low-dimensional expression of input features that is sufficient for predicting output values. In this paper, we propose a novel distribution-free SDR method called sufficient component analysis (SCA), which is computationally more efficient than existing methods. In our method, a solution is computed by iteratively performing dependence estimation and maximization: dependence estimation is analytically carried out by the recently proposed least-squares mutual information (LSMI) estimator, and dependence maximization is also analytically carried out by utilizing the Epanechnikov kernel. Through large-scale experiments on real-world image classification and audio tagging problems, the proposed method is shown to compare favorably with existing dimension reduction approaches.
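The dependence-estimation step the abstract refers to, LSMI, fits the density ratio r(x, y) = p(x, y) / (p(x) p(y)) by least squares and has a closed-form solution. Below is a minimal NumPy sketch of a plain LSMI estimator with Gaussian kernel bases centered at the paired samples; the function names, the fixed bandwidth `sigma`, and the regularizer `lam` are illustrative placeholders (the LSMI literature selects such hyperparameters by cross-validation), and this is not the authors' implementation of SCA itself.

```python
# Illustrative LSMI sketch (assumptions: Gaussian kernel bases at the paired
# samples, fixed sigma and lam instead of cross-validated choices).
import numpy as np

def gaussian_kernel(a, b, sigma):
    """Pairwise Gaussian kernel matrix between rows of a and rows of b."""
    sq = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
    return np.exp(-sq / (2 * sigma**2))

def lsmi(x, y, sigma=1.0, lam=1e-3):
    """Least-squares estimate of squared-loss mutual information SMI(X, Y).

    Fits r_alpha(x, y) = sum_l alpha_l K(x, x_l) K(y, y_l) to the density
    ratio p(x, y) / (p(x) p(y)) under the squared loss; both the coefficients
    and the SMI estimate are analytic.
    """
    n = x.shape[0]
    Kx = gaussian_kernel(x, x, sigma)   # (n, n): K(x_i, x_l)
    Ky = gaussian_kernel(y, y, sigma)   # (n, n): K(y_j, y_l)
    # H_{l,l'} = (1/n^2) sum_{i,j} phi_l(x_i, y_j) phi_{l'}(x_i, y_j),
    # which factorizes into an elementwise product of Gram-matrix products.
    H = (Kx.T @ Kx) * (Ky.T @ Ky) / n**2
    # h_l = (1/n) sum_i phi_l(x_i, y_i), evaluated on the paired samples.
    h = np.mean(Kx * Ky, axis=0)
    alpha = np.linalg.solve(H + lam * np.eye(n), h)
    # Plug-in SMI estimate: (1/2) h^T alpha - 1/2.
    return 0.5 * h @ alpha - 0.5

# Toy check: the estimate should be clearly positive for dependent pairs
# and close to zero for independent pairs.
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
print(lsmi(x, x + 0.1 * rng.normal(size=(200, 1))))  # dependent
print(lsmi(x, rng.normal(size=(200, 1))))            # independent
```

In the SCA scheme the abstract describes, an estimator of this kind would be re-evaluated inside the alternating loop, with the projection matrix then updated analytically in the dependence-maximization step.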
Cite
Text
Yamada et al. "Computationally Efficient Sufficient Dimension Reduction via Squared-Loss Mutual Information." In Proceedings of the Third Asian Conference on Machine Learning, 2011.
BibTeX
@inproceedings{yamada2011acml-computationally,
title = {{Computationally Efficient Sufficient Dimension Reduction via Squared-Loss Mutual Information}},
author = {Yamada, Makoto and Niu, Gang and Takagi, Jun and Sugiyama, Masashi},
booktitle = {Proceedings of the Third Asian Conference on Machine Learning},
year = {2011},
pages = {247--262},
volume = {20},
url = {https://mlanthology.org/acml/2011/yamada2011acml-computationally/}
}