Sketch Recognition via Part-Based Hierarchical Analogical Learning
Abstract
Sketch recognition has been studied for decades, but it is far from solved. Drawing styles are highly variable across people, and adapting to idiosyncratic visual expressions requires data-efficient learning. Explainability also matters, so that users can see why a system got confused about something. This paper introduces a novel part-based approach for sketch recognition, based on hierarchical analogical learning, a new method for applying analogical learning to qualitative representations. Given a sketched object, our system automatically segments it into parts and constructs multi-level qualitative representations of them. Our approach performs analogical generalization at multiple levels of part descriptions and uses coarse-grained results to guide interpretation at finer levels. Experiments on the TU Berlin dataset and the Coloring Book Objects dataset show that the system can learn explainable models in a data-efficient manner.
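To make the coarse-to-fine idea concrete, the following is a minimal, hypothetical Python sketch (not the authors' implementation, which uses structured analogical generalization over qualitative representations): a probe sketch is matched against learned category generalizations at a coarse level of part description first, and the result prunes the candidates considered at finer levels. All function names, the similarity measure, and the toy qualitative facts are assumptions for illustration.

def qualitative_overlap(desc_a, desc_b):
    """Crude stand-in for analogical similarity: fraction of shared
    qualitative facts, e.g. ('right-of', 'handle', 'body')."""
    return len(desc_a & desc_b) / max(len(desc_a | desc_b), 1)

def classify(sketch_levels, generalizations):
    """sketch_levels: {level: set of qualitative facts for the probe sketch}.
    generalizations: {label: {level: set of facts}} learned per category.
    Coarser levels are matched first; each level keeps only the labels that
    score near the best, so finer-grained matching sees fewer candidates."""
    candidates = list(generalizations)
    for level in sorted(sketch_levels):          # e.g. 0 = whole object, 1 = parts
        scores = {
            label: qualitative_overlap(sketch_levels[level],
                                       generalizations[label][level])
            for label in candidates
        }
        best = max(scores.values())
        candidates = [lbl for lbl, s in scores.items() if s >= 0.8 * best]
        if len(candidates) == 1:                 # coarse level already decisive
            break
    return candidates[0]

# Toy usage with hand-made qualitative facts (hypothetical categories).
gens = {
    "mug":   {0: {("closed-shape",), ("has-part", "handle")},
              1: {("right-of", "handle", "body"), ("smaller", "handle", "body")}},
    "chair": {0: {("has-part", "legs"), ("has-part", "back")},
              1: {("below", "legs", "seat"), ("above", "back", "seat")}},
}
probe = {0: {("closed-shape",), ("has-part", "handle")},
         1: {("right-of", "handle", "body")}}
print(classify(probe, gens))   # -> "mug"

In this toy version the coarse level alone settles the label; in the paper's setting, coarse-level generalizations instead narrow which finer-grained part interpretations are worth pursuing.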
Cite
Text
Chen et al. "Sketch Recognition via Part-Based Hierarchical Analogical Learning." International Joint Conference on Artificial Intelligence, 2023. doi:10.24963/IJCAI.2023/331Markdown
[Chen et al. "Sketch Recognition via Part-Based Hierarchical Analogical Learning." International Joint Conference on Artificial Intelligence, 2023.](https://mlanthology.org/ijcai/2023/chen2023ijcai-sketch/) doi:10.24963/IJCAI.2023/331BibTeX
@inproceedings{chen2023ijcai-sketch,
title = {{Sketch Recognition via Part-Based Hierarchical Analogical Learning}},
author = {Chen, Kezhen and Forbus, Kenneth D. and Srinivasan, Balaji Vasan and Chhaya, Niyati and Usher, Madeline},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2023},
pages = {2967--2974},
doi = {10.24963/IJCAI.2023/331},
url = {https://mlanthology.org/ijcai/2023/chen2023ijcai-sketch/}
}