Modelling Primate Control of Grasping for Robotics Applications
Abstract
The neural circuits that control grasping and perform related visual processing have been studied extensively in macaque monkeys. We are developing a computational model of this system, in order to better understand its function, and to explore applications to robotics. We recently modelled the neural representation of three-dimensional object shapes, and are currently extending the model to produce hand postures so that it can be tested on a robot. To train the extended model, we are developing a large database of object shapes and corresponding feasible grasps. Finally, further extensions are needed to account for the influence of higher-level goals on hand posture. This is essential because often the same object must be grasped in different ways for different purposes. The present paper focuses on a method of incorporating such higher-level goals. A proof-of-concept exhibits several important behaviours, such as choosing from multiple approaches to the same goal. Finally, we discuss a neural representation of objects that supports fast searching for analogous objects.
Cite
Text
Kleinhans et al. "Modelling Primate Control of Grasping for Robotics Applications." European Conference on Computer Vision Workshops, 2014. doi:10.1007/978-3-319-16181-5_33
Markdown
[Kleinhans et al. "Modelling Primate Control of Grasping for Robotics Applications." European Conference on Computer Vision Workshops, 2014.](https://mlanthology.org/eccvw/2014/kleinhans2014eccvw-modelling/) doi:10.1007/978-3-319-16181-5_33
BibTeX
@inproceedings{kleinhans2014eccvw-modelling,
title = {{Modelling Primate Control of Grasping for Robotics Applications}},
author = {Kleinhans, Ashley and Thill, Serge and Rosman, Benjamin and Detry, Renaud and Tripp, Bryan P.},
booktitle = {European Conference on Computer Vision Workshops},
year = {2014},
  pages = {438--447},
doi = {10.1007/978-3-319-16181-5_33},
url = {https://mlanthology.org/eccvw/2014/kleinhans2014eccvw-modelling/}
}