Learning Meta-Features for AutoML
Abstract
This paper tackles the AutoML problem, which aims to automatically select the ML algorithm and hyper-parameter configuration most appropriate to the dataset at hand. The proposed approach, MetaBu, learns new meta-features via an Optimal Transport procedure, aligning the manually designed meta-features with the space of distributions on the hyper-parameter configurations. MetaBu meta-features, learned once and for all, induce a topology on the set of datasets that is exploited to define a distribution of promising hyper-parameter configurations amenable to AutoML. Experiments on the OpenML CC-18 benchmark demonstrate that using MetaBu meta-features boosts the performance of state-of-the-art AutoML systems, AutoSklearn (Feurer et al. 2015) and Probabilistic Matrix Factorization (Fusi et al. 2018). Furthermore, inspecting MetaBu meta-features gives some hints as to when an ML algorithm performs well. Finally, the topology based on MetaBu meta-features makes it possible to estimate the intrinsic dimensionality of the OpenML benchmark w.r.t. a given ML algorithm or pipeline. The source code is available at https://github.com/luxusg1/metabu.
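To make the abstract's core idea more concrete, here is a minimal, self-contained NumPy sketch of entropic-regularized Optimal Transport (Sinkhorn iterations) matching two views of a small set of datasets: a hand-crafted meta-feature view and a hyper-parameter-configuration view. All names, dimensions, and data here are hypothetical toy choices for illustration; this is not the MetaBu implementation (which learns a mapping, see the linked repository).

```python
import numpy as np

def sinkhorn(cost, reg=1.0, n_iter=200):
    """Entropic-regularized optimal transport between uniform marginals."""
    n, m = cost.shape
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    K = np.exp(-cost / reg)  # Gibbs kernel
    u = np.ones(n)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]  # transport plan

rng = np.random.default_rng(0)
n_datasets = 5
# Hypothetical toy data: each dataset described by 4 hand-crafted
# meta-features, and separately by a 3-d embedding of its promising
# hyper-parameter configurations.
meta = rng.normal(size=(n_datasets, 4))
target = rng.normal(size=(n_datasets, 3))

# The two spaces have different dimensions, so compare datasets through
# their pairwise-distance profiles in each space (a Gromov-Wasserstein-
# flavored cost), then transport one view onto the other.
D_meta = np.linalg.norm(meta[:, None] - meta[None, :], axis=-1)
D_target = np.linalg.norm(target[:, None] - target[None, :], axis=-1)
cost = np.linalg.norm(D_meta[:, None, :] - D_target[None, :, :], axis=-1)

plan = sinkhorn(cost)  # soft matching between the two views
```

The transport plan is a doubly-stochastic-like coupling whose large entries indicate which datasets look alike across the two representations; MetaBu uses this kind of alignment signal to learn new meta-features.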
Cite
Text
Rakotoarison et al. "Learning Meta-Features for AutoML." International Conference on Learning Representations, 2022.

Markdown
[Rakotoarison et al. "Learning Meta-Features for AutoML." International Conference on Learning Representations, 2022.](https://mlanthology.org/iclr/2022/rakotoarison2022iclr-learning/)

BibTeX
@inproceedings{rakotoarison2022iclr-learning,
  title     = {{Learning Meta-Features for AutoML}},
  author    = {Rakotoarison, Herilalaina and Milijaona, Louisot and Rasoanaivo, Andry and Sebag, Michele and Schoenauer, Marc},
  booktitle = {International Conference on Learning Representations},
  year      = {2022},
  url       = {https://mlanthology.org/iclr/2022/rakotoarison2022iclr-learning/}
}