A Scaling Law for Syn2real Transfer: How Much Is Your Pre-Training Effective?
Abstract
Synthetic-to-real transfer learning is a framework in which a synthetically generated dataset is used to pre-train a model to improve its performance on real vision tasks. The most significant advantage of using synthetic images is that ground-truth labels are automatically available, enabling unlimited expansion of the data size without human annotation cost. However, synthetic data may suffer from a large domain gap, in which case increasing the data size does not improve performance. How can we detect this? In this study, we derive a simple scaling law that predicts performance from the amount of pre-training data. By estimating the parameters of the law, we can judge whether we should increase the data or change the image-synthesis settings. Further, we analyze the theory of transfer learning by considering learning dynamics and confirm that the derived generalization bound is consistent with our empirical findings. We empirically validate our scaling law across various experimental settings of benchmark tasks, model sizes, and complexities of synthetic images.
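The exact parameterization of the law is given in the paper; as a rough illustration of the workflow the abstract describes, the sketch below assumes a common power-law-plus-floor form, err(n) = D·n^(-α) + C, where n is the pre-training data size, and fits it to a handful of measured (data size, test error) pairs. The functional form, parameter names, and all numbers here are illustrative assumptions, not the paper's reported results.

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed power-law-plus-floor form for fine-tuned test error as a
# function of pre-training data size n; D, alpha, C are estimated.
# (The paper defines its own parameterization; this is a stand-in.)
def scaling_law(n, D, alpha, C):
    return D * n ** (-alpha) + C

# Illustrative measurements: test errors after fine-tuning models
# pre-trained on increasing amounts of synthetic data (made-up numbers).
n_pretrain = np.array([1e3, 3e3, 1e4, 3e4, 1e5, 3e5])
test_error = np.array([0.42, 0.35, 0.29, 0.25, 0.23, 0.22])

params, _ = curve_fit(scaling_law, n_pretrain, test_error,
                      p0=[1.0, 0.3, 0.1], maxfev=10000)
D, alpha, C = params
print(f"D={D:.3f}, alpha={alpha:.3f}, floor C={C:.3f}")

# Decision rule sketched in the abstract: if the fitted curve has nearly
# flattened (projected error close to the floor C), adding more synthetic
# data will not help, so the image-synthesis setting should change instead.
projected = scaling_law(1e6, *params)
print(f"projected error at n=1e6: {projected:.3f} (floor: {C:.3f})")
```

Under this reading, the fitted exponent α measures how effective the pre-training data still is, and the floor C bounds what more data can buy.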
Cite
Text
Mikami et al. "A Scaling Law for Syn2real Transfer: How Much Is Your Pre-Training Effective?" European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2022. doi:10.1007/978-3-031-26409-2_29
Markdown
[Mikami et al. "A Scaling Law for Syn2real Transfer: How Much Is Your Pre-Training Effective?" European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2022.](https://mlanthology.org/ecmlpkdd/2022/mikami2022ecmlpkdd-scaling/) doi:10.1007/978-3-031-26409-2_29
BibTeX
@inproceedings{mikami2022ecmlpkdd-scaling,
title = {{A Scaling Law for Syn2real Transfer: How Much Is Your Pre-Training Effective?}},
author = {Mikami, Hiroaki and Fukumizu, Kenji and Murai, Shogo and Suzuki, Shuji and Kikuchi, Yuta and Suzuki, Taiji and Maeda, Shin-ichi and Hayashi, Kohei},
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
year = {2022},
pages = {477--492},
doi = {10.1007/978-3-031-26409-2_29},
url = {https://mlanthology.org/ecmlpkdd/2022/mikami2022ecmlpkdd-scaling/}
}