Partial Trace Regression and Low-Rank Kraus Decomposition
Abstract
The trace regression model, a direct extension of the well-studied linear regression model, allows one to map matrices to real-valued outputs. We here introduce an even more general model, namely the partial-trace regression model, a family of linear mappings from matrix-valued inputs to matrix-valued outputs; this model subsumes the trace regression model and thus the linear regression model. Borrowing tools from quantum information theory, where partial trace operators have been extensively studied, we propose a framework for learning partial trace regression models from data by taking advantage of the so-called low-rank Kraus representation of completely positive maps. We show the relevance of our framework with synthetic and real-world experiments conducted for both i) matrix-to-matrix regression and ii) positive semidefinite matrix completion, two tasks which can be formulated as partial trace regression problems.
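The key object the abstract mentions is the Kraus representation of a completely positive map: a mapping of the form Phi(X) = sum_j A_j X A_j^T, where the number of terms r is the Kraus rank. The sketch below is illustrative only (the operators, dimensions, and random data are assumptions, not from the paper); it shows how such a low-rank map sends a PSD input matrix to a PSD output matrix, which is the structural property the partial trace regression framework exploits.

```python
import numpy as np

rng = np.random.default_rng(0)

p, q, r = 4, 3, 2  # input size, output size, Kraus rank (illustrative choices)

# Kraus operators A_j of shape (q, p). The map X -> sum_j A_j X A_j^T
# is completely positive, so PSD inputs are mapped to PSD outputs.
kraus = [rng.standard_normal((q, p)) for _ in range(r)]

def cp_map(X, kraus):
    """Apply the low-rank Kraus map: Phi(X) = sum_j A_j X A_j^T."""
    return sum(A @ X @ A.T for A in kraus)

# Build a PSD input matrix.
B = rng.standard_normal((p, p))
X = B @ B.T

Y = cp_map(X, kraus)

# The output is symmetric and PSD: each term A_j X A_j^T is PSD,
# and a sum of PSD matrices is PSD (eigenvalues nonnegative up to round-off).
print(np.allclose(Y, Y.T))                       # → True
print(np.linalg.eigvalsh(Y).min() >= -1e-10)     # → True
```

In the learning setting described in the abstract, the operators A_j would be the parameters fitted from (input, output) matrix pairs rather than drawn at random; choosing a small r gives the low-rank structure the paper exploits.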
Cite
Text
Kadri et al. "Partial Trace Regression and Low-Rank Kraus Decomposition." International Conference on Machine Learning, 2020.
Markdown
[Kadri et al. "Partial Trace Regression and Low-Rank Kraus Decomposition." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/kadri2020icml-partial/)
BibTeX
@inproceedings{kadri2020icml-partial,
title = {{Partial Trace Regression and Low-Rank Kraus Decomposition}},
author = {Kadri, Hachem and Ayache, Stephane and Huusari, Riikka and Rakotomamonjy, Alain and Ralaivola, Liva},
booktitle = {International Conference on Machine Learning},
year = {2020},
pages = {5031--5041},
volume = {119},
url = {https://mlanthology.org/icml/2020/kadri2020icml-partial/}
}