Toward Learning to Press Doorbell Buttons
Abstract
To function in human-inhabited environments, a robot must be able to press buttons. There are thousands of different buttons, which produce various types of feedback when pressed. This work focuses on doorbell buttons, which provide auditory feedback. Our robot learned to predict whether a specific pushing movement would press a doorbell button and produce a sound. The robot explored different buttons with random pushing behaviors and perceived the proprioceptive, tactile, and acoustic outcomes of these behaviors.
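As a rough illustration of this kind of prediction problem (not the authors' implementation), the sketch below trains a binary classifier on synthetic exploration data to estimate whether a candidate push will produce a sound. The feature layout, the synthetic labels, and the choice of classifier are all assumptions made for this example.

```python
# Illustrative sketch only: predict whether a candidate pushing movement
# will ring a doorbell, given labeled outcomes from random exploration.
# Features, labels, and the classifier choice are assumptions, not the
# method described in the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical exploration log: each row encodes one random pushing behavior
# (e.g., parameters of the fingertip trajectory) plus tactile summaries
# recorded while executing it.
n_trials = 500
behaviors = rng.uniform(-1.0, 1.0, size=(n_trials, 6))  # trajectory parameters
tactile = rng.uniform(0.0, 1.0, size=(n_trials, 2))      # peak/mean fingertip pressure
X = np.hstack([behaviors, tactile])

# Label: 1 if a buzzer sound was detected during the trial, else 0.
# Labels here are synthetic; on a real robot they would come from
# detecting the doorbell in the audio stream.
y = (tactile[:, 0] > 0.6).astype(int)

# Fit a predictor of "will this push produce a sound?"
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

# Before executing a new candidate push, estimate its chance of producing sound.
candidate = np.hstack([rng.uniform(-1.0, 1.0, 6), [0.7, 0.4]]).reshape(1, -1)
print("P(sound) =", model.predict_proba(candidate)[0, 1])
```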
Cite
Text
Wu et al. "Toward Learning to Press Doorbell Buttons." AAAI Conference on Artificial Intelligence, 2010. doi:10.1609/AAAI.V24I1.7790
Markdown
[Wu et al. "Toward Learning to Press Doorbell Buttons." AAAI Conference on Artificial Intelligence, 2010.](https://mlanthology.org/aaai/2010/wu2010aaai-learning/) doi:10.1609/AAAI.V24I1.7790
BibTeX
@inproceedings{wu2010aaai-learning,
  title = {{Toward Learning to Press Doorbell Buttons}},
  author = {Wu, Liping and Sukhoy, Vladimir and Stoytchev, Alexander},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year = {2010},
  pages = {1965--1966},
  doi = {10.1609/AAAI.V24I1.7790},
  url = {https://mlanthology.org/aaai/2010/wu2010aaai-learning/}
}