Visual Routines and Visual Search: A Real-Time Implementation and an Automata-Theoretic Analysis
Abstract
I describe a real-time implementation of Ullman's visual routine processor (VRP) theory of intermediate vision for visual search. The system performs serial self-terminating visual search and computes 2D spatial relations of objects from live color video using low-cost hardware. I present a formal model of a VRP with unbounded resources and quantify the amount of external control structure required to solve Horn clauses using the VRP. In discussing the effect of resource limitations, I show that contemporary models of biological visual attention are unable to solve surprisingly simple queries. I also describe a novel logic programming system that finds satisfying variable assignments for Horn clause queries using the VRP. The system contains no internal database: all logic variables are directly grounded in the world using VRP queries. Finally, I briefly discuss experiments with natural language interpretation and motor control using the VRP. Experiments on real data are given.
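The serial self-terminating search mentioned in the abstract can be sketched as follows. This is a minimal illustration of the search mode itself, not the paper's VRP implementation; all names and the toy "display" are assumptions introduced here.

```python
# Illustrative sketch of serial self-terminating visual search: items are
# attended one at a time, and the scan stops ("self-terminates") as soon as
# a target is found. Names are hypothetical, not from the paper.

def serial_self_terminating_search(items, is_target):
    """Scan items serially; return the index of the first target, or None.

    Comparisons grow linearly with display size, and target-present scans
    stop at the first match rather than examining every item -- the
    behavioral signature distinguishing this mode from exhaustive search.
    """
    for i, item in enumerate(items):
        if is_target(item):      # attend to one item per step
            return i             # self-terminate on the first match
    return None                  # target absent: the scan was exhaustive

# Toy display of (color, orientation) items; search for a red vertical bar.
display = [("green", "vertical"), ("red", "horizontal"), ("red", "vertical")]
found = serial_self_terminating_search(
    display, lambda it: it == ("red", "vertical"))
```

On a target-absent display the function returns `None` only after scanning every item, which is what makes the search serial and exhaustive in that case.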
Cite
Horswill, Ian. "Visual Routines and Visual Search: A Real-Time Implementation and an Automata-Theoretic Analysis." International Joint Conference on Artificial Intelligence, 1995, pp. 56-63.
BibTeX:
@inproceedings{horswill1995ijcai-visual,
title = {{Visual Routines and Visual Search: A Real-Time Implementation and an Automata-Theoretic Analysis}},
author = {Horswill, Ian},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {1995},
pages = {56--63},
url = {https://mlanthology.org/ijcai/1995/horswill1995ijcai-visual/}
}