Semi-Supervised Regression Using Hessian Energy with an Application to Semi-Supervised Dimensionality Reduction
Abstract
Semi-supervised regression based on the graph Laplacian suffers from two problems: the solution is biased towards a constant, and it lacks extrapolating power. Starting from these observations, we propose to use the second-order Hessian energy for semi-supervised regression, which overcomes both of these problems. In particular, if the data lies on or close to a low-dimensional submanifold in the feature space, the Hessian energy prefers functions which vary "linearly" with respect to the natural parameters in the data. This property also makes it particularly suited for the task of semi-supervised dimensionality reduction, where the goal is to find the natural parameters in the data based on a few labeled points. The experimental results suggest that our method is superior to semi-supervised regression using Laplacian regularization and to standard supervised methods, and that it is particularly suited for semi-supervised dimensionality reduction.
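The abstract contrasts a quadratic Hessian energy regularizer with the usual graph Laplacian regularizer for semi-supervised regression. Below is a minimal sketch of how such a quadratic regularizer enters the semi-supervised regression objective. The matrix `M` stands for either the graph Laplacian or an estimated Hessian energy matrix; the construction of the Hessian energy matrix from local tangent-space fits, which is the paper's contribution, is not reproduced here. Function names, the Gaussian similarity graph, and the toy data are illustrative assumptions, not the authors' code.

```python
# Sketch (not the authors' implementation): semi-supervised regression with a
# generic quadratic regularizer.  `M` may be the graph Laplacian L or a
# precomputed Hessian energy matrix; only the former is built here.
import numpy as np

def graph_laplacian(X, sigma=1.0):
    """Unnormalized graph Laplacian of a Gaussian similarity graph (assumed setup)."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq_dists / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(axis=1)) - W

def semi_supervised_regression(M, labeled_idx, y_labeled, lam=1e-2):
    """Solve  min_f  sum_{i in labeled} (f_i - y_i)^2 + lam * f^T M f.

    Setting the gradient to zero gives the linear system (J + lam * M) f = y_ext,
    where J is diagonal with ones at the labeled positions and y_ext holds the
    labels at those positions and zeros elsewhere.
    """
    n = M.shape[0]
    J = np.zeros((n, n))
    J[labeled_idx, labeled_idx] = 1.0
    y_ext = np.zeros(n)
    y_ext[labeled_idx] = y_labeled
    return np.linalg.solve(J + lam * M, y_ext)

# Toy usage: points sampled along a 1-D curve in 2-D, two labeled endpoints.
t = np.linspace(0.0, 1.0, 50)
X = np.stack([np.cos(np.pi * t), np.sin(np.pi * t)], axis=1)
L = graph_laplacian(X, sigma=0.2)
f = semi_supervised_regression(L, labeled_idx=np.array([0, 49]),
                               y_labeled=np.array([0.0, 1.0]))
```

With the Laplacian, the predicted values between the labeled endpoints tend to flatten towards a constant; the paper's point is that replacing `M` with the Hessian energy matrix instead favors solutions that vary linearly along the manifold's natural parameter.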
Cite
Text
Kim et al. "Semi-Supervised Regression Using Hessian Energy with an Application to Semi-Supervised Dimensionality Reduction." Neural Information Processing Systems, 2009.
Markdown
[Kim et al. "Semi-Supervised Regression Using Hessian Energy with an Application to Semi-Supervised Dimensionality Reduction." Neural Information Processing Systems, 2009.](https://mlanthology.org/neurips/2009/kim2009neurips-semisupervised/)
BibTeX
@inproceedings{kim2009neurips-semisupervised,
title = {{Semi-Supervised Regression Using Hessian Energy with an Application to Semi-Supervised Dimensionality Reduction}},
author = {Kim, Kwang I. and Steinke, Florian and Hein, Matthias},
booktitle = {Neural Information Processing Systems},
year = {2009},
pages = {979--987},
url = {https://mlanthology.org/neurips/2009/kim2009neurips-semisupervised/}
}