Regularizers Versus Losses for Nonlinear Dimensionality Reduction: A Factored View with New Convex Relaxations
Abstract
We demonstrate that almost all non-parametric dimensionality reduction methods can be expressed by a simple procedure: regularized loss minimization plus singular value truncation. By distinguishing the roles of the loss and the regularizer in such a process, we recover a factored perspective that reveals some gaps in the current literature. Beyond identifying a useful new loss for manifold unfolding, a key contribution is to derive new convex regularizers that combine distance maximization with rank reduction. These regularizers can be applied to any loss.
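The final step of the recipe named in the abstract, singular value truncation, can be sketched in a few lines. The snippet below is an illustrative example only (it is not the paper's algorithm): it truncates a learned positive semidefinite similarity matrix `K` to rank `d` and reads off low-dimensional coordinates from its top eigenpairs, as in kernel PCA. The function names `svd_truncate` and `embed` are our own for this sketch.

```python
import numpy as np

def svd_truncate(K, d):
    """Rank-d truncation of a (learned) similarity matrix via SVD.

    Keeps only the top-d singular values -- the 'singular value
    truncation' step of the regularized-loss-then-truncate recipe.
    """
    U, s, Vt = np.linalg.svd(K, hermitian=True)
    return U[:, :d] * s[:d] @ Vt[:d, :]

def embed(K, d):
    """d-dimensional coordinates from a PSD matrix K (kernel-PCA style)."""
    vals, vecs = np.linalg.eigh(K)
    idx = np.argsort(vals)[::-1][:d]          # top-d eigenvalues
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))
```

In the factored view, the choice of loss and regularizer determines which matrix `K` is learned; the truncation step above is shared by all of the methods the paper unifies.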
Cite
Text
Neufeld et al. "Regularizers Versus Losses for Nonlinear Dimensionality Reduction: A Factored View with New Convex Relaxations." International Conference on Machine Learning, 2012.

Markdown
[Neufeld et al. "Regularizers Versus Losses for Nonlinear Dimensionality Reduction: A Factored View with New Convex Relaxations." International Conference on Machine Learning, 2012.](https://mlanthology.org/icml/2012/neufeld2012icml-regularizers/)

BibTeX
@inproceedings{neufeld2012icml-regularizers,
  title     = {{Regularizers Versus Losses for Nonlinear Dimensionality Reduction: A Factored View with New Convex Relaxations}},
  author    = {Neufeld, James and Yu, Yaoliang and Zhang, Xinhua and Kiros, Ryan and Schuurmans, Dale},
  booktitle = {International Conference on Machine Learning},
  year      = {2012},
  url       = {https://mlanthology.org/icml/2012/neufeld2012icml-regularizers/}
}