Towards Perceptual Interface for Visualization Navigation of Large Data Sets

Abstract

This paper presents a perceptual interface for visualization navigation using gesture recognition. Scientists want intuitive, interactive environments for exploring large data sets. The input consists of registered 3-D data. Bezier curves are used for trajectory analysis and classification of gestures. The method is robust and reliable: hand identification is correct in 99.9% of 1,641 frames, the mode of hand movement is classified correctly 95.6% of the time, and gesture recognition (given the correct mode) reaches 97.9%. An application to gesture-controlled visualization is also presented. The paper advances the state of the art in human-computer interaction with robust, attachment- and marker-free gestural information processing for visualization.
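The abstract mentions Bezier curves for trajectory analysis. As a minimal illustrative sketch (not the authors' implementation; the point format and function names are assumptions), a hand trajectory can be modeled by evaluating a Bezier curve over its control points with de Casteljau's algorithm:

```python
# Sketch only: evaluating a cubic Bezier curve with de Casteljau's
# algorithm, as one might use to model a smoothed gesture trajectory.
# Control points and names are illustrative assumptions.

def bezier_point(ctrl, t):
    """Evaluate a Bezier curve defined by 2-D control points at t in [0, 1]."""
    pts = [tuple(p) for p in ctrl]
    # Repeatedly interpolate adjacent points until one point remains.
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

def sample_curve(ctrl, n=50):
    """Sample n points along the curve, e.g. for comparison against a tracked trajectory."""
    return [bezier_point(ctrl, i / (n - 1)) for i in range(n)]
```

A classifier along these lines could fit such curves to tracked hand positions and compare their shape parameters across gesture classes.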

Cite

Text

Shin et al. "Towards Perceptual Interface for Visualization Navigation of Large Data Sets." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2003. doi:10.1109/CVPRW.2003.10045

Markdown

[Shin et al. "Towards Perceptual Interface for Visualization Navigation of Large Data Sets." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2003.](https://mlanthology.org/cvprw/2003/shin2003cvprw-perceptual/) doi:10.1109/CVPRW.2003.10045

BibTeX

@inproceedings{shin2003cvprw-perceptual,
  title     = {{Towards Perceptual Interface for Visualization Navigation of Large Data Sets}},
  author    = {Shin, Min C. and Tsap, Leonid V. and Goldgof, Dmitry B.},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
  year      = {2003},
  pages     = {48},
  doi       = {10.1109/CVPRW.2003.10045},
  url       = {https://mlanthology.org/cvprw/2003/shin2003cvprw-perceptual/}
}