LogME: Practical Assessment of Pre-Trained Models for Transfer Learning

Abstract

This paper studies task adaptive pre-trained model selection, an underexplored problem of assessing pre-trained models for a target task and selecting the best ones from the model zoo \emph{without fine-tuning}. A few pilot works have addressed the problem for transferring supervised pre-trained models to classification tasks, but they cannot handle emerging unsupervised pre-trained models or regression tasks. In pursuit of a practical assessment method, we propose to estimate the maximum value of label evidence given features extracted by pre-trained models. Unlike the maximum likelihood, the maximum evidence is \emph{immune to over-fitting}, and its expensive computation can be dramatically reduced by our carefully designed algorithm. The Logarithm of Maximum Evidence (LogME) can be used to assess pre-trained models for transfer learning: a pre-trained model with a high LogME value is likely to have good transfer performance. LogME is \emph{fast, accurate, and general}, making it the first practical method for assessing pre-trained models. Compared with brute-force fine-tuning, LogME brings at most a $3000\times$ speedup in wall-clock time and requires only $1\%$ of the memory footprint. It outperforms prior methods by a large margin in their setting and is applicable to new settings. It is general enough for diverse pre-trained models (supervised and unsupervised), downstream tasks (classification and regression), and modalities (vision and language). Code is available at \href{https://github.com/thuml/LogME}{https://github.com/thuml/LogME}.
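To make the evidence maximization described above concrete, the sketch below scores one real-valued target under a Bayesian linear model on frozen features, with Gaussian prior precision $\alpha$ and noise precision $\beta$ updated by MacKay-style fixed-point iterations. This is a minimal NumPy illustration, not the authors' optimized implementation: the function name, starting values, and tolerances are assumptions, and the official code in the linked repository should be preferred.

```python
import numpy as np

def logme_score(f, y, tol=1e-3, max_iter=100):
    """Per-sample log maximum evidence of targets y given features f.

    f: (n, d) features extracted by a frozen pre-trained model.
    y: (n,) real-valued target; for K-way classification, call once per
       one-hot column and average the scores.
    """
    n, d = f.shape
    u, s, _ = np.linalg.svd(f, full_matrices=False)   # f = u @ diag(s) @ vh
    sigma = s ** 2                                    # eigenvalues of f.T @ f
    z2 = (u.T @ y) ** 2                               # squared projections of y
    res2 = max(float(y @ y - z2.sum()), 0.0)          # part of y outside span(f)

    alpha, beta = 1.0, 1.0                            # illustrative starting values
    for _ in range(max_iter):
        # Effective number of parameters, posterior mean norm, and residual.
        gamma = float((beta * sigma / (alpha + beta * sigma)).sum())
        m2 = float((beta ** 2 * sigma * z2 / (alpha + beta * sigma) ** 2).sum())
        e2 = float((alpha ** 2 * z2 / (alpha + beta * sigma) ** 2).sum()) + res2
        alpha_new, beta_new = gamma / m2, (n - gamma) / e2
        converged = (abs(alpha_new - alpha) / alpha < tol
                     and abs(beta_new - beta) / beta < tol)
        alpha, beta = alpha_new, beta_new
        if converged:
            break

    # log|A| for A = alpha * I + beta * f.T @ f; eigenvalues beyond
    # rank(f) all equal alpha.
    logdet = (float(np.log(alpha + beta * sigma).sum())
              + max(d - len(sigma), 0) * np.log(alpha))
    evidence = 0.5 * (n * np.log(beta) + d * np.log(alpha)
                      - n * np.log(2 * np.pi)
                      - beta * e2 - alpha * m2 - logdet)
    return evidence / n
```

Ranking a model zoo then amounts to extracting features on the target data with each candidate model, computing its score, and picking the model with the highest value; no fine-tuning is involved, which is where the reported speedup over brute-force fine-tuning comes from.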

Cite

Text

You et al. "LogME: Practical Assessment of Pre-Trained Models for Transfer Learning." International Conference on Machine Learning, 2021.

Markdown

[You et al. "LogME: Practical Assessment of Pre-Trained Models for Transfer Learning." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/you2021icml-logme/)

BibTeX

@inproceedings{you2021icml-logme,
  title     = {{LogME: Practical Assessment of Pre-Trained Models for Transfer Learning}},
  author    = {You, Kaichao and Liu, Yong and Wang, Jianmin and Long, Mingsheng},
  booktitle = {International Conference on Machine Learning},
  year      = {2021},
  pages     = {12133--12143},
  volume    = {139},
  url       = {https://mlanthology.org/icml/2021/you2021icml-logme/}
}