Improved Linear Embeddings via Lagrange Duality
Abstract
Near-isometric orthogonal embeddings into lower dimensions are a fundamental tool in data science and machine learning. In this paper, we present a construction of such embeddings that minimizes the maximum distortion over a given set of points. We formulate the problem as a non-convex constrained optimization problem. We first construct a primal relaxation and then use the theory of Lagrange duality to create a dual relaxation. We also propose a polynomial-time algorithm, based on the theory of convex optimization, that provably solves the dual relaxation. We provide a theoretical upper bound on the approximation guarantees of our algorithm, which depends only on the spectral properties of the dataset. We experimentally demonstrate the superiority of our algorithm over baselines in terms of scalability and the ability to achieve lower distortion.
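As a hedged illustration of the objective described above (not the paper's algorithm), the sketch below measures the maximum pairwise distortion of a linear embedding `P` with orthonormal rows over a point set, using a plain PCA projection as a baseline embedding; all function names here are hypothetical.

```python
import numpy as np

def max_distortion(X, P):
    # Maximum over pairs (i, j) of | ||P(x_i - x_j)||^2 / ||x_i - x_j||^2 - 1 |.
    # Since P has orthonormal rows, the squared-norm ratio lies in [0, 1].
    n = len(X)
    worst = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            v = X[i] - X[j]
            ratio = np.linalg.norm(P @ v) ** 2 / np.linalg.norm(v) ** 2
            worst = max(worst, abs(ratio - 1.0))
    return worst

def pca_embedding(X, k):
    # Baseline embedding: rows of P are the top-k right singular
    # vectors of the centered data (an orthonormal set).
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k]

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 20))   # 50 points in 20 dimensions
P = pca_embedding(X, k=5)           # embed into 5 dimensions
print(max_distortion(X, P))
```

The paper's contribution is an embedding that minimizes this max-distortion quantity directly via a Lagrange dual relaxation, rather than the variance-preserving PCA baseline sketched here.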
Cite
Text
Sheth et al. "Improved Linear Embeddings via Lagrange Duality." Machine Learning, 2019. doi:10.1007/s10994-018-5729-x
Markdown
[Sheth et al. "Improved Linear Embeddings via Lagrange Duality." Machine Learning, 2019.](https://mlanthology.org/mlj/2019/sheth2019mlj-improved/) doi:10.1007/s10994-018-5729-x
BibTeX
@article{sheth2019mlj-improved,
title = {{Improved Linear Embeddings via Lagrange Duality}},
author = {Sheth, Kshiteej and Garg, Dinesh and Dasgupta, Anirban},
journal = {Machine Learning},
year = {2019},
pages = {575-594},
doi = {10.1007/s10994-018-5729-x},
volume = {108},
url = {https://mlanthology.org/mlj/2019/sheth2019mlj-improved/}
}