On Consequences of Finetuning on Data with Highly Discriminative Features

Abstract

In the era of transfer learning, training neural networks from scratch is becoming obsolete. Transfer learning leverages prior knowledge for new tasks, conserving computational resources. While its advantages are well-documented, we uncover a notable drawback: finetuned networks tend to prioritize simple, highly discriminative patterns in the new data, forsaking valuable pre-learned features. We term this behavior "feature erosion" and analyze its impact on network performance and internal representations.

Cite

Text

Masarczyk et al. "On Consequences of Finetuning on Data with Highly Discriminative Features." NeurIPS 2023 Workshops: UniReps, 2023.

Markdown

[Masarczyk et al. "On Consequences of Finetuning on Data with Highly Discriminative Features." NeurIPS 2023 Workshops: UniReps, 2023.](https://mlanthology.org/neuripsw/2023/masarczyk2023neuripsw-consequences/)

BibTeX

@inproceedings{masarczyk2023neuripsw-consequences,
  title     = {{On Consequences of Finetuning on Data with Highly Discriminative Features}},
  author    = {Masarczyk, Wojciech and Trzcinski, Tomasz and Ostaszewski, Mateusz},
  booktitle = {NeurIPS 2023 Workshops: UniReps},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/masarczyk2023neuripsw-consequences/}
}