Bridging Layout and RTL: Knowledge Distillation Based Timing Prediction
Abstract
Accurate and efficient timing prediction at the register-transfer level (RTL) remains a fundamental challenge in electronic design automation (EDA), particularly in striking a balance between accuracy and computational efficiency. While static timing analysis (STA) provides high-fidelity results through comprehensive physical parameters, its computational overhead makes it impractical for rapid design iterations. Conversely, existing RTL-level approaches sacrifice accuracy due to the limited physical information available. We propose RTLDistil, a novel cross-stage knowledge distillation framework that bridges this gap by transferring precise physical characteristics from a layout-aware teacher model (Teacher GNN) to an efficient RTL-level student model (Student GNN), both implemented as graph neural networks (GNNs). RTLDistil efficiently predicts key timing metrics, such as arrival time (AT), and employs a multi-granularity distillation strategy that captures timing-critical features at the node, subgraph, and global levels. Experimental results demonstrate that RTLDistil achieves a significant reduction in RTL-level timing prediction error compared to state-of-the-art prediction models. This framework enables accurate early-stage timing prediction, advancing EDA's "left-shift" paradigm while maintaining computational efficiency. Our code and dataset will be publicly available at https://github.com/sklp-eda-lab/RTLDistil.
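The multi-granularity distillation strategy described above can be illustrated with a minimal sketch. The sketch below is an assumption-laden illustration, not the paper's exact formulation: it assumes mean-squared-error alignment between teacher and student node embeddings, simple mean pooling for subgraph and global summaries, and illustrative loss weights (`w_node`, `w_sub`, `w_glob`).

```python
# Hypothetical sketch of a multi-granularity distillation loss.
# Teacher/student embeddings are lists of equal-length float vectors,
# one per node, in the same node order. All names here are illustrative.

def mse(a, b):
    """Mean squared error between two equal-length vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def mean_vec(vecs):
    """Element-wise mean of a list of vectors (a simple pooling)."""
    n = len(vecs)
    return [sum(v[i] for v in vecs) / n for i in range(len(vecs[0]))]

def distill_loss(teacher, student, subgraphs,
                 w_node=1.0, w_sub=1.0, w_glob=1.0):
    """Combine node-, subgraph-, and global-level alignment terms.

    teacher/student: per-node embedding lists from the two GNNs.
    subgraphs: list of node-index lists partitioning the graph.
    """
    # Node level: align each student node embedding with the teacher's.
    node_loss = sum(mse(t, s) for t, s in zip(teacher, student)) / len(teacher)
    # Subgraph level: align pooled embeddings of each subgraph.
    sub_loss = sum(
        mse(mean_vec([teacher[i] for i in sg]),
            mean_vec([student[i] for i in sg]))
        for sg in subgraphs
    ) / len(subgraphs)
    # Global level: align whole-graph pooled embeddings.
    glob_loss = mse(mean_vec(teacher), mean_vec(student))
    return w_node * node_loss + w_sub * sub_loss + w_glob * glob_loss
```

In an actual training loop, a term like this would be added to the student's supervised loss on the timing labels (e.g., predicted arrival times), so the student learns both from ground truth and from the layout-aware teacher's representations.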
Cite
Text
Wang et al. "Bridging Layout and RTL: Knowledge Distillation Based Timing Prediction." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown
[Wang et al. "Bridging Layout and RTL: Knowledge Distillation Based Timing Prediction." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/wang2025icml-bridging/)

BibTeX
@inproceedings{wang2025icml-bridging,
title = {{Bridging Layout and RTL: Knowledge Distillation Based Timing Prediction}},
author = {Wang, Mingjun and Wen, Yihan and Sun, Bin and Mu, Jianan and Li, Juan and Wang, Xiaoyi and Ye, Jing Justin and Yu, Bei and Li, Huawei},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {64895--64913},
volume = {267},
url = {https://mlanthology.org/icml/2025/wang2025icml-bridging/}
}