On the Relationship Between Disentanglement and Multi-Task Learning

Abstract

One of the main arguments for studying disentangled representations is the assumption that they can be easily reused across different tasks. At the same time, finding a joint, adaptable representation of data is one of the key challenges in the multi-task learning setting. In this paper, we take a closer look at the relationship between disentanglement and multi-task learning based on hard parameter sharing. We perform a thorough empirical study of the representations obtained by neural networks trained on automatically generated supervised tasks. Using a set of standard metrics, we show that disentanglement appears naturally during the process of multi-task neural network training.

Cite

Text

Maziarka et al. "On the Relationship Between Disentanglement and Multi-Task Learning." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2022. doi:10.1007/978-3-031-26387-3_38

Markdown

[Maziarka et al. "On the Relationship Between Disentanglement and Multi-Task Learning." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2022.](https://mlanthology.org/ecmlpkdd/2022/maziarka2022ecmlpkdd-relationship/) doi:10.1007/978-3-031-26387-3_38

BibTeX

@inproceedings{maziarka2022ecmlpkdd-relationship,
  title     = {{On the Relationship Between Disentanglement and Multi-Task Learning}},
  author    = {Maziarka, Lukasz and Nowak, Aleksandra and Wolczyk, Maciej and Bedychaj, Andrzej},
  booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
  year      = {2022},
  pages     = {625--641},
  doi       = {10.1007/978-3-031-26387-3_38},
  url       = {https://mlanthology.org/ecmlpkdd/2022/maziarka2022ecmlpkdd-relationship/}
}