Learning Deep Representations with Probabilistic Knowledge Transfer

Abstract

Knowledge Transfer (KT) techniques tackle the problem of transferring the knowledge from a large and complex neural network into a smaller and faster one. However, existing KT methods are tailored to classification tasks and cannot be used efficiently for other representation learning tasks. In this paper we propose a novel probabilistic knowledge transfer method that works by matching the probability distribution of the data in the feature space instead of their actual representations. Apart from outperforming existing KT techniques, the proposed method overcomes several of their limitations, providing new insight into KT as well as enabling novel KT applications, ranging from KT from handcrafted feature extractors to cross-modal KT from the textual modality into the representation extracted from the visual modality of the data.
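The core idea of the abstract — aligning the probability distributions induced by pairwise sample similarities in the teacher and student feature spaces, rather than the raw feature vectors — can be sketched as below. This is a minimal illustrative sketch, assuming a cosine-similarity kernel and a KL-divergence objective; it is not claimed to reproduce the paper's exact formulation or hyperparameters.

```python
import numpy as np

def pkt_loss(teacher_feats, student_feats, eps=1e-7):
    """Sketch of a probabilistic knowledge transfer loss: KL divergence
    between conditional probability distributions derived from pairwise
    similarities in the teacher and student feature spaces.

    teacher_feats, student_feats: (n_samples, dim) arrays; the two
    feature dimensionalities may differ, since only pairwise sample
    similarities are compared.
    """
    def cond_probs(x):
        # Cosine similarity between all pairs of samples.
        x = x / (np.linalg.norm(x, axis=1, keepdims=True) + eps)
        sim = x @ x.T
        # Shift similarities into [0, 1] and zero out self-similarity.
        sim = (sim + 1.0) / 2.0
        np.fill_diagonal(sim, 0.0)
        # Normalize each row into a conditional probability distribution.
        return sim / (sim.sum(axis=1, keepdims=True) + eps)

    p = cond_probs(teacher_feats)  # target distribution (teacher)
    q = cond_probs(student_feats)  # model distribution (student)
    # KL(p || q), averaged over samples; minimized w.r.t. the student.
    return float(np.mean(np.sum(p * np.log((p + eps) / (q + eps)), axis=1)))
```

Because only sample-to-sample similarity structure is matched, the student's feature dimensionality is free to differ from the teacher's, which is what allows KT from handcrafted extractors or across modalities.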

Cite

Text

Passalis and Tefas. "Learning Deep Representations with Probabilistic Knowledge Transfer." Proceedings of the European Conference on Computer Vision (ECCV), 2018. doi:10.1007/978-3-030-01252-6_17

Markdown

[Passalis and Tefas. "Learning Deep Representations with Probabilistic Knowledge Transfer." Proceedings of the European Conference on Computer Vision (ECCV), 2018.](https://mlanthology.org/eccv/2018/passalis2018eccv-learning/) doi:10.1007/978-3-030-01252-6_17

BibTeX

@inproceedings{passalis2018eccv-learning,
  title     = {{Learning Deep Representations with Probabilistic Knowledge Transfer}},
  author    = {Passalis, Nikolaos and Tefas, Anastasios},
  booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
  year      = {2018},
  doi       = {10.1007/978-3-030-01252-6_17},
  url       = {https://mlanthology.org/eccv/2018/passalis2018eccv-learning/}
}