Integration of Visual and Somatosensory Information for Preshaping Hand in Grasping Movements
Abstract
The primate brain must solve two important problems in grasping movements. The first problem concerns the recognition of grasped objects: specifically, how does the brain integrate visual and motor information on a grasped object? The second problem concerns hand shape planning: specifically, how does the brain design the hand configuration suited to the shape of the object and the manipulation task? A neural network model that solves these problems has been developed. The operations of the network are divided into a learning phase and an optimization phase. In the learning phase, internal representations, which depend on the grasped objects and the task, are acquired by integrating visual and somatosensory information. In the optimization phase, the most suitable hand shape for grasping an object is determined by using a relaxation computation of the network.
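To make the two phases concrete, below is a minimal sketch, assuming an autoencoder-style network that learns a joint internal representation of "visual" object features and "somatosensory" hand joint angles, and then finds a hand shape for a new object by relaxation, i.e. gradient descent on the hand-angle inputs with the learned weights frozen. All dimensions, the toy data, and the names (forward, train, relax_hand_shape) are hypothetical illustrations, not the authors' actual architecture or training procedure.

```python
# Illustrative sketch only: learning phase = acquire a shared representation of
# visual + somatosensory input; optimization phase = relax the hand configuration
# for a given object by minimizing reconstruction error with weights frozen.
import numpy as np

rng = np.random.default_rng(0)

N_VIS, N_HAND, N_HIDDEN = 6, 10, 4   # hypothetical dimensions
W1 = rng.normal(0, 0.1, (N_HIDDEN, N_VIS + N_HAND))
W2 = rng.normal(0, 0.1, (N_VIS + N_HAND, N_HIDDEN))

def forward(x):
    """Encode the combined visual + hand input and reconstruct it."""
    h = np.tanh(W1 @ x)              # internal representation
    return W2 @ h, h

def train(data, lr=0.05, epochs=2000):
    """Learning phase: fit the weights on paired visual/somatosensory examples
    by minimizing squared reconstruction error."""
    global W1, W2
    for _ in range(epochs):
        for x in data:
            y, h = forward(x)
            err = y - x
            dW2 = np.outer(err, h)
            dh = W2.T @ err * (1 - h ** 2)
            dW1 = np.outer(dh, x)
            W2 -= lr * dW2
            W1 -= lr * dW1

def relax_hand_shape(vis, hand0, lr=0.1, steps=500):
    """Optimization phase: with weights frozen, adjust only the hand-angle part
    of the input so the network's reconstruction error is minimized."""
    hand = hand0.copy()
    for _ in range(steps):
        x = np.concatenate([vis, hand])
        y, h = forward(x)
        err = y - x
        # Gradient of 0.5*||y - x||^2 w.r.t. the input x.
        dx = W1.T @ (W2.T @ err * (1 - h ** 2)) - err
        hand -= lr * dx[N_VIS:]      # only the hand configuration is free
    return hand

# Toy usage: random "grasping experiences" pairing object features with hand shapes.
data = [rng.normal(size=N_VIS + N_HAND) for _ in range(20)]
train(data)
new_object = rng.normal(size=N_VIS)
hand_shape = relax_hand_shape(new_object, hand0=np.zeros(N_HAND))
print("relaxed hand configuration:", np.round(hand_shape, 3))
```

The design point the sketch tries to capture is that the same learned network serves both roles: its hidden layer is the object/task-dependent internal representation, and the relaxation step reuses the frozen weights as a constraint that pulls an initial hand configuration toward one consistent with the visual description of the object.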
Cite
Text
Uno et al. "Integration of Visual and Somatosensory Information for Preshaping Hand in Grasping Movements." Neural Information Processing Systems, 1992.
Markdown
[Uno et al. "Integration of Visual and Somatosensory Information for Preshaping Hand in Grasping Movements." Neural Information Processing Systems, 1992.](https://mlanthology.org/neurips/1992/uno1992neurips-integration/)
BibTeX
@inproceedings{uno1992neurips-integration,
title = {{Integration of Visual and Somatosensory Information for Preshaping Hand in Grasping Movements}},
author = {Uno, Yoji and Fukumura, Naohiro and Suzuki, Ryoji and Kawato, Mitsuo},
booktitle = {Neural Information Processing Systems},
year = {1992},
pages = {311--318},
url = {https://mlanthology.org/neurips/1992/uno1992neurips-integration/}
}