PED-ANOVA: Efficiently Quantifying Hyperparameter Importance in Arbitrary Subspaces
Abstract
The recent rise in popularity of Hyperparameter Optimization (HPO) for deep learning has highlighted the role that good hyperparameter (HP) space design can play in training strong models. In turn, designing a good HP space is critically dependent on understanding the role of different HPs. This motivates research on HP Importance (HPI), e.g., with the popular method of functional ANOVA (f-ANOVA). However, the original f-ANOVA formulation is inapplicable to the subspaces most relevant to algorithm designers, such as those defined by top performance. To overcome this issue, we derive a novel formulation of f-ANOVA for arbitrary subspaces and propose an algorithm that uses Pearson divergence (PED) to enable a closed-form calculation of HPI. We demonstrate that this new algorithm, dubbed PED-ANOVA, is able to successfully identify important HPs in different subspaces while also being extremely computationally efficient. See https://arxiv.org/abs/2304.10255 for the latest version with Appendix.
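The abstract's core idea — scoring a hyperparameter by how much its marginal distribution in a top-performing subspace diverges from its marginal over the whole space, measured with the Pearson (chi-squared) divergence — can be illustrated with a toy sketch. This is not the authors' algorithm (PED-ANOVA works with densities and a closed-form derivation; see the paper), only a discretized histogram-based approximation of the idea; the function names, the `gamma` quantile threshold, and the binning scheme are illustrative assumptions.

```python
import numpy as np

def pearson_divergence(p, q, eps=1e-12):
    """Pearson (chi-squared) divergence between discrete distributions p and q."""
    q = np.clip(q, eps, None)
    return np.sum((p - q) ** 2 / q)

def toy_hpi(X, y, gamma=0.1, bins=10):
    """Toy hyperparameter-importance scores (illustrative, not PED-ANOVA itself).

    For each hyperparameter, compare its histogram over the top-gamma
    configurations (lowest losses) against its histogram over all
    observations; a larger divergence suggests the HP matters more for
    reaching the top subspace. Returns scores normalized to sum to 1.
    """
    n, d = X.shape
    top = y <= np.quantile(y, gamma)  # minimization: keep the best gamma fraction
    scores = []
    for j in range(d):
        edges = np.linspace(X[:, j].min(), X[:, j].max(), bins + 1)
        p, _ = np.histogram(X[top, j], bins=edges)
        q, _ = np.histogram(X[:, j], bins=edges)
        scores.append(pearson_divergence(p / p.sum(), q / q.sum()))
    s = np.array(scores)
    return s / s.sum()
```

On synthetic data where the objective depends mostly on one hyperparameter, that hyperparameter's marginal among top configurations concentrates near its optimum, so its divergence score dominates.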
Cite
Text
Watanabe et al. "PED-ANOVA: Efficiently Quantifying Hyperparameter Importance in Arbitrary Subspaces." International Joint Conference on Artificial Intelligence, 2023. doi:10.24963/IJCAI.2023/488
Markdown
[Watanabe et al. "PED-ANOVA: Efficiently Quantifying Hyperparameter Importance in Arbitrary Subspaces." International Joint Conference on Artificial Intelligence, 2023.](https://mlanthology.org/ijcai/2023/watanabe2023ijcai-ped/) doi:10.24963/IJCAI.2023/488
BibTeX
@inproceedings{watanabe2023ijcai-ped,
title = {{PED-ANOVA: Efficiently Quantifying Hyperparameter Importance in Arbitrary Subspaces}},
author = {Watanabe, Shuhei and Bansal, Archit and Hutter, Frank},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2023},
pages = {4389--4396},
doi = {10.24963/IJCAI.2023/488},
url = {https://mlanthology.org/ijcai/2023/watanabe2023ijcai-ped/}
}