Symmetry reCAPTCHA
Abstract
This work is a reaction to the poor performance of symmetry detection algorithms on real-world images, as benchmarked since CVPR 2011. Our systematic study reveals a significant difference between human-labeled (reflection and rotation) symmetries on photos and the output of computer vision algorithms on the same photo set. We exploit this human-machine symmetry perception gap by proposing a novel symmetry-based Turing test. Leveraging a comprehensive labeling interface, we collected more than 78,000 symmetry labels from 400 Amazon Mechanical Turk raters on nearly 1,000 photos from the Microsoft COCO dataset. Using a set of ground-truth symmetries automatically generated from these noisy human labels, the effectiveness of our approach is evidenced by a separate test that achieves a success rate of over 96%. We demonstrate statistically significant outcomes for using symmetry perception as a powerful, alternative, image-based reCAPTCHA.
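To make the reCAPTCHA idea concrete, below is a minimal sketch of how a symmetry challenge could be scored: a responder's drawn reflection axes are compared against crowd-derived ground-truth axes under orientation and position tolerances, and the response passes if enough axes are recovered. The axis representation, the `axes_match` and `passes_challenge` helpers, and all thresholds are illustrative assumptions for this sketch, not the authors' published protocol.

```python
import math

# Hypothetical sketch: verify user-drawn reflection-symmetry axes against
# ground-truth axes aggregated from crowd labels. An axis is a pair of
# endpoints in pixel coordinates; tolerances below are illustrative only.


def axis_angle(p1, p2):
    """Orientation of the axis in degrees, folded into [0, 180)."""
    ang = math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))
    return ang % 180.0


def midpoint(p1, p2):
    return ((p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0)


def axes_match(user_axis, truth_axis, angle_tol=10.0, center_tol=20.0):
    """Accept a response axis if its orientation and midpoint are both close
    to the ground-truth axis (thresholds are assumptions, not paper values)."""
    da = abs(axis_angle(*user_axis) - axis_angle(*truth_axis))
    da = min(da, 180.0 - da)  # orientations are equivalent modulo 180 degrees
    mu, mt = midpoint(*user_axis), midpoint(*truth_axis)
    dc = math.hypot(mu[0] - mt[0], mu[1] - mt[1])
    return da <= angle_tol and dc <= center_tol


def passes_challenge(user_axes, truth_axes, min_fraction=0.5):
    """Pass if at least `min_fraction` of ground-truth axes are recovered."""
    if not truth_axes:
        return False
    hits = sum(any(axes_match(u, t) for u in user_axes) for t in truth_axes)
    return hits / len(truth_axes) >= min_fraction


if __name__ == "__main__":
    truth = [((100, 50), (100, 250))]   # one vertical ground-truth axis
    user = [((102, 60), (98, 240))]     # a slightly tilted human response
    print(passes_challenge(user, truth))  # True under these tolerances
```

In this toy usage, the human's slightly tilted axis stays within the assumed 10-degree and 20-pixel tolerances, so the challenge is passed; a bot whose detector misses or misplaces the dominant axis would fail.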
Cite
Text
Funk and Liu. "Symmetry reCAPTCHA." Conference on Computer Vision and Pattern Recognition, 2016. doi:10.1109/CVPR.2016.558

Markdown
[Funk and Liu. "Symmetry reCAPTCHA." Conference on Computer Vision and Pattern Recognition, 2016.](https://mlanthology.org/cvpr/2016/funk2016cvpr-symmetry/) doi:10.1109/CVPR.2016.558

BibTeX
@inproceedings{funk2016cvpr-symmetry,
  title = {{Symmetry reCAPTCHA}},
  author = {Funk, Chris and Liu, Yanxi},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year = {2016},
  doi = {10.1109/CVPR.2016.558},
  url = {https://mlanthology.org/cvpr/2016/funk2016cvpr-symmetry/}
}