ParZC: Parametric Zero-Cost Proxies for Efficient NAS
Abstract
Recent advancements in Zero-shot Neural Architecture Search (NAS) highlight the ability of zero-cost proxies to identify superior architectures. However, we identify a critical issue with current zero-cost proxies: they aggregate node-wise zero-cost statistics without considering that not all nodes in a neural network contribute equally to performance estimation. Our observations reveal that node-wise zero-cost statistics vary significantly in their contributions to performance, with each node exhibiting a degree of uncertainty. Based on this insight, we introduce the Parametric Zero-Cost Proxies (ParZC) framework, which enhances the adaptability of zero-cost proxies through parameterization. To address this node indiscrimination, we propose a Mixer Architecture with Bayesian Network (MABN) to explore the node-wise zero-cost statistics and estimate node-specific uncertainty. Moreover, we propose DiffKendall as a loss function to improve ranking consistency. Comprehensive experiments on NAS-Bench-101, NAS-Bench-201, and NDS demonstrate the superiority of our proposed ParZC over existing zero-shot NAS methods. Additionally, we demonstrate the versatility and adaptability of ParZC on the Vision Transformer search space.
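The DiffKendall loss mentioned above relies on the common idea of making Kendall's tau differentiable by relaxing the non-differentiable sign comparison over pairs. The sketch below illustrates that idea only; the function name, the `alpha` sharpness parameter, and the tanh relaxation are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def soft_kendall_tau(pred, target, alpha=1.0):
    """Differentiable surrogate of Kendall's tau (illustrative sketch).

    Kendall's tau counts concordant minus discordant pairs via
    sign(pred_i - pred_j) * sign(target_i - target_j). Replacing
    sign(d) with tanh(alpha * d) yields a smooth statistic, so
    1 - tau can serve as a gradient-friendly ranking loss.
    """
    pred = np.asarray(pred, dtype=float)
    target = np.asarray(target, dtype=float)
    n = len(pred)
    # Indices of all ordered pairs i < j.
    i, j = np.triu_indices(n, k=1)
    # Soft concordance score for each pair, in (-1, 1).
    concord = np.tanh(alpha * (pred[i] - pred[j])) \
            * np.tanh(alpha * (target[i] - target[j]))
    # Normalize by the number of pairs, n*(n-1)/2.
    return concord.sum() / (n * (n - 1) / 2)
```

With a large `alpha` the surrogate approaches the exact tau: perfectly concordant rankings score near +1, fully reversed ones near -1.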
Cite
Text
Dong et al. "ParZC: Parametric Zero-Cost Proxies for Efficient NAS." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I15.33793
Markdown
[Dong et al. "ParZC: Parametric Zero-Cost Proxies for Efficient NAS." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/dong2025aaai-parzc/) doi:10.1609/AAAI.V39I15.33793
BibTeX
@inproceedings{dong2025aaai-parzc,
title = {{ParZC: Parametric Zero-Cost Proxies for Efficient NAS}},
author = {Dong, Peijie and Li, Lujun and Tang, Zhenheng and Liu, Xiang and Wei, Zimian and Wang, Qiang and Chu, Xiaowen},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2025},
pages = {16327-16335},
doi = {10.1609/AAAI.V39I15.33793},
url = {https://mlanthology.org/aaai/2025/dong2025aaai-parzc/}
}