A Convex Formulation for Learning Shared Structures from Multiple Tasks

Abstract

Multi-task learning (MTL) aims to improve generalization performance by learning multiple related tasks simultaneously. In this paper, we consider the problem of learning shared structures from multiple related tasks. We present an improved formulation (iASO) for multi-task learning based on the non-convex alternating structure optimization (ASO) algorithm, in which all tasks are related by a shared feature representation. We convert iASO, a non-convex formulation, into a relaxed convex one, which is, however, not scalable to large data sets due to its complex constraints. We propose an alternating optimization (cASO) algorithm which solves the convex relaxation efficiently, and further show that cASO converges to a global optimum. In addition, we present a theoretical condition, under which cASO can find a globally optimal solution to iASO. Experiments on several benchmark data sets confirm our theoretical analysis.
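The sketch below illustrates the alternating idea behind ASO/iASO that the abstract refers to, not the paper's cASO algorithm on the convex relaxation. It is a minimal sketch under several assumptions: squared loss, the standard ASO parametrization in which each task's full predictor is w_t = u_t + Theta^T v_t with a shared projection Theta having orthonormal rows, and regularization weights alpha (task-specific component) and beta (full weight). All variable names are ours.

```python
import numpy as np

def iaso_alternating(Xs, ys, h, alpha=1.0, beta=1.0, n_iters=20, seed=0):
    """Sketch of ASO-style alternating optimization with squared loss.

    Xs, ys : lists of per-task data (n_t x d feature matrices, n_t targets)
    h      : dimension of the shared feature subspace (h <= number of tasks)
    """
    d = Xs[0].shape[1]
    m = len(Xs)
    assert h <= min(d, m)
    rng = np.random.default_rng(seed)
    # Initialize the shared projection Theta (h x d) with orthonormal rows.
    Q, _ = np.linalg.qr(rng.standard_normal((d, h)))
    Theta = Q.T
    W = np.zeros((d, m))  # column t holds the full weight vector of task t

    for _ in range(n_iters):
        # Step 1: with Theta fixed, each task reduces to a ridge-like problem
        # in w_t; the component of w_t outside the shared subspace is
        # penalized by alpha + beta, the component inside only by beta.
        P = Theta.T @ Theta                       # projection onto the shared subspace
        reg = alpha * (np.eye(d) - P) + beta * np.eye(d)
        for t, (X, y) in enumerate(zip(Xs, ys)):
            n_t = X.shape[0]
            A = X.T @ X / n_t + reg
            W[:, t] = np.linalg.solve(A, X.T @ y / n_t)
        # Step 2: with the weights fixed, the optimal Theta spans the top-h
        # left singular vectors of the stacked weight matrix W.
        U, _, _ = np.linalg.svd(W, full_matrices=False)
        Theta = U[:, :h].T
    return W, Theta
```

Step 2 is the classical ASO subspace update via a truncated SVD of the task weight matrix; because of the orthonormality constraint on Theta, this alternating scheme is non-convex and may stop at a local solution. The paper's cASO algorithm instead alternates on a convex relaxation of iASO (with the constraint on Theta^T Theta relaxed to a convex set), which is what yields the global-optimality guarantees stated in the abstract.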

Cite

Text

Chen et al. "A Convex Formulation for Learning Shared Structures from Multiple Tasks." International Conference on Machine Learning, 2009. doi:10.1145/1553374.1553392

Markdown

[Chen et al. "A Convex Formulation for Learning Shared Structures from Multiple Tasks." International Conference on Machine Learning, 2009.](https://mlanthology.org/icml/2009/chen2009icml-convex/) doi:10.1145/1553374.1553392

BibTeX

@inproceedings{chen2009icml-convex,
  title     = {{A Convex Formulation for Learning Shared Structures from Multiple Tasks}},
  author    = {Chen, Jianhui and Tang, Lei and Liu, Jun and Ye, Jieping},
  booktitle = {International Conference on Machine Learning},
  year      = {2009},
  pages     = {137--144},
  doi       = {10.1145/1553374.1553392},
  url       = {https://mlanthology.org/icml/2009/chen2009icml-convex/}
}