Manifold Regularization for SIR with Rate Root-N Convergence
Abstract
In this paper, we study manifold regularization for Sliced Inverse Regression (SIR). Manifold regularization improves the standard SIR in two respects: 1) it encodes the local geometry of the data for SIR, and 2) it enables SIR to handle transductive and semi-supervised learning problems. We prove that the proposed graph Laplacian based regularization converges at the root-n rate. The projection directions of the regularized SIR are optimized by a conjugate gradient method on the Grassmann manifold. Experimental results support our theory.
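The core estimator the abstract describes can be illustrated compactly. The sketch below is an assumption-laden simplification, not the paper's exact method: it builds the standard SIR between-slice covariance, adds a graph-Laplacian penalty from a Gaussian-kernel affinity graph, and then solves a generalized eigenproblem in place of the paper's conjugate-gradient optimization on the Grassmann manifold. The function name `sir_laplacian` and the parameters `n_slices`, `n_dirs`, `alpha`, and `sigma` are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.linalg import eigh

def sir_laplacian(X, y, n_slices=10, n_dirs=2, alpha=0.1, sigma=1.0):
    """Sketch of SIR with a graph-Laplacian smoothness penalty (illustrative)."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)

    # Between-slice covariance: estimate of Cov(E[X | Y]) from slice means,
    # with slices formed by sorting the responses y.
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Xc[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)

    # Graph Laplacian of a Gaussian-kernel affinity graph over the inputs
    # (assumption: a fully connected graph; the paper's construction may differ).
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    W = np.exp(-sq / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(axis=1)) - W

    # Sample covariance plus the Laplacian penalty, which encodes local geometry.
    R = Xc.T @ Xc / n + alpha * (X.T @ L @ X) / n ** 2

    # Leading generalized eigenvectors of M v = lambda R v give the directions.
    vals, vecs = eigh(M, R)
    return vecs[:, ::-1][:, :n_dirs]
```

On toy data generated as `y = X[:, 0] + noise`, the leading estimated direction aligns closely with the first coordinate axis, which is the qualitative behavior one expects from any SIR variant.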
Cite
Text
Bian and Tao. "Manifold Regularization for SIR with Rate Root-N Convergence." Neural Information Processing Systems, 2009.
Markdown
[Bian and Tao. "Manifold Regularization for SIR with Rate Root-N Convergence." Neural Information Processing Systems, 2009.](https://mlanthology.org/neurips/2009/bian2009neurips-manifold/)
BibTeX
@inproceedings{bian2009neurips-manifold,
title = {{Manifold Regularization for SIR with Rate Root-N Convergence}},
author = {Bian, Wei and Tao, Dacheng},
booktitle = {Neural Information Processing Systems},
year = {2009},
pages = {117-125},
url = {https://mlanthology.org/neurips/2009/bian2009neurips-manifold/}
}