Coordinate-Descent for Learning Orthogonal Matrices Through Givens Rotations
Abstract
Optimizing over the set of orthogonal matrices is a central component in problems like sparse PCA and tensor decomposition. Unfortunately, such optimization is hard because simple operations on orthogonal matrices easily break orthogonality, and restoring it is usually computationally expensive. Here we propose a framework for optimizing orthogonal matrices that is the parallel of coordinate descent in Euclidean spaces. It is based on *Givens rotations*, a fast-to-compute operation that affects only a small number of entries in the learned matrix and preserves orthogonality. We show two applications of this approach: an algorithm for tensor decomposition used in learning mixture models, and an algorithm for sparse PCA. We study the parameter regime in which a Givens-rotation approach converges faster and achieves a superior model on a genome-wide, brain-wide mRNA expression dataset.
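To make the core operation concrete, below is a minimal NumPy sketch, not taken from the paper: the function names, the random choice of coordinate pair, and the grid line search over the angle are illustrative assumptions standing in for the paper's actual coordinate-selection and step rules. It shows how a single Givens rotation updates only two columns of an orthogonal matrix while keeping it exactly orthogonal, and how such rotations can drive a coordinate-descent loop over a generic objective `f`.

```python
import numpy as np

def apply_givens(U, i, j, theta):
    """Right-multiply U in place by the Givens rotation G(i, j, theta).

    Only columns i and j of U change (O(n) work for an n x n matrix),
    and orthogonality is preserved exactly since G is itself orthogonal.
    """
    c, s = np.cos(theta), np.sin(theta)
    ui, uj = U[:, i].copy(), U[:, j].copy()
    U[:, i] = c * ui - s * uj
    U[:, j] = s * ui + c * uj
    return U

def givens_coordinate_descent(f, n, n_iters=100, n_angles=50, seed=0):
    """Minimize a scalar objective f(U) over n x n orthogonal U,
    one Givens rotation at a time.

    Each iteration picks a coordinate pair (i, j) at random and
    line-searches the rotation angle on a fixed grid -- a placeholder
    for whatever one-dimensional minimization the objective admits.
    """
    rng = np.random.default_rng(seed)
    U = np.eye(n)
    angles = np.linspace(-np.pi / 2, np.pi / 2, n_angles)
    for _ in range(n_iters):
        i, j = rng.choice(n, size=2, replace=False)
        # Evaluate f along the one-parameter rotation curve and keep the best angle.
        best = min(angles, key=lambda t: f(apply_givens(U.copy(), i, j, t)))
        apply_givens(U, i, j, best)
    return U
```

Because every update touches only two columns and never leaves the orthogonal group, no re-orthogonalization step (e.g., a QR or polar decomposition) is needed between iterations; this is the source of the speedup the abstract describes.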
Cite

Text

Shalit and Chechik. "Coordinate-Descent for Learning Orthogonal Matrices Through Givens Rotations." International Conference on Machine Learning, 2014.

Markdown

[Shalit and Chechik. "Coordinate-Descent for Learning Orthogonal Matrices Through Givens Rotations." International Conference on Machine Learning, 2014.](https://mlanthology.org/icml/2014/shalit2014icml-coordinatedescent/)

BibTeX
@inproceedings{shalit2014icml-coordinatedescent,
title = {{Coordinate-Descent for Learning Orthogonal Matrices Through Givens Rotations}},
author = {Shalit, Uri and Chechik, Gal},
booktitle = {International Conference on Machine Learning},
year = {2014},
pages = {548--556},
volume = {32},
url = {https://mlanthology.org/icml/2014/shalit2014icml-coordinatedescent/}
}