Learning to Read Analog Gauges from Synthetic Data
Abstract
Manually reading and logging gauge data is time-consuming, and the effort grows with the number of gauges to monitor. We present a pipeline that automates the reading of analog gauges. We propose a two-stage CNN pipeline that identifies the key structural components of an analog gauge and outputs an angular reading. To facilitate training, we generate a synthetic dataset of realistic analog gauges with corresponding annotations. To validate our proposal, we additionally collected a real-world dataset of 4,813 manually curated images. Compared against state-of-the-art methods, our approach reduces the average error by 4.55, a 52% relative improvement. The resources for this project will be made available at: https://github.com/fuankarion/automatic-gauge-reading.
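The pipeline's final output is an angular reading, which must then be mapped to a physical value on the gauge's scale. This mapping typically reduces to linear interpolation between the angles of the minimum and maximum scale marks. A minimal sketch of that last step (the function name and parameters are our own illustration, not from the paper):

```python
def angle_to_reading(needle_angle, min_angle, max_angle, min_value, max_value):
    """Linearly interpolate a gauge value from a predicted needle angle.

    All angles are in degrees, measured in the same convention, with
    min_angle/max_angle the angles of the scale's first and last tick marks.
    (Hypothetical interface; the paper's code may differ.)
    """
    # Fraction of the scale's angular sweep covered by the needle.
    frac = (needle_angle - min_angle) / (max_angle - min_angle)
    # Map that fraction onto the value range printed on the gauge face.
    return min_value + frac * (max_value - min_value)
```

For example, on a 0-100 gauge whose scale spans 45° to 315°, a needle at 180° sits exactly halfway along the sweep and reads 50.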
Cite
Text
Leon-Alcazar et al. "Learning to Read Analog Gauges from Synthetic Data." Winter Conference on Applications of Computer Vision, 2024.
Markdown
[Leon-Alcazar et al. "Learning to Read Analog Gauges from Synthetic Data." Winter Conference on Applications of Computer Vision, 2024.](https://mlanthology.org/wacv/2024/leonalcazar2024wacv-learning/)
BibTeX
@inproceedings{leonalcazar2024wacv-learning,
  title = {{Learning to Read Analog Gauges from Synthetic Data}},
  author = {Leon-Alcazar, Juan and Alnumay, Yazeed and Zheng, Cheng and Trigui, Hassane and Patel, Sahejad and Ghanem, Bernard},
  booktitle = {Winter Conference on Applications of Computer Vision},
  year = {2024},
  pages = {8616--8625},
  url = {https://mlanthology.org/wacv/2024/leonalcazar2024wacv-learning/}
}