Modeling embodied visual behaviors
Author: | Al Robinson, Dana H. Ballard, Nathan Sprague |
Year of publication: | 2007 |
Subject: |
General Computer Science, Computer science, Experimental and Cognitive Psychology, Theoretical Computer Science, Human–computer interaction, Embodied cognition, Reinforcement learning, Visual attention, Artificial intelligence, Graphics, Set (psychology), Level of detail |
Source: | ACM Transactions on Applied Perception. 4:11 |
ISSN: | 1544-3965 1544-3558 |
DOI: | 10.1145/1265957.1265960 |
Description: | To make progress in understanding human visuomotor behavior, we will need to understand its basic components at an abstract level. One way to achieve such an understanding would be to create a model of a human with sufficient complexity to generate such behaviors. Recent technological advances make progress in this direction possible. Graphics models that simulate extensive human capabilities can serve as platforms from which to develop synthetic models of visuomotor behavior. Currently, such models can capture only a small portion of a full behavioral repertoire, but for the behaviors that they do model, they can describe complete visuomotor subsystems at a useful level of detail. The value in doing so is that the body's elaborate visuomotor structures greatly simplify the specification of the abstract behaviors that guide them. The net result is that, essentially, one is faced with proposing an embodied “operating system” model for picking the right set of abstract behaviors at each instant. This paper outlines one such model. A centerpiece of the model uses vision to aid the behavior that has the most to gain from taking environmental measurements. Preliminary tests of the model against human performance in realistic VR environments show that the main features of the model show up in human behavior. |
Database: | OpenAIRE |
External link: |
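The abstract's arbitration idea, giving vision to the behavior that has the most to gain from an environmental measurement, can be sketched as a simple scheduling loop. This is an illustrative sketch, not the paper's implementation: the behavior names, the linear uncertainty-growth rates, and the use of accumulated uncertainty as a proxy for expected loss are all assumptions made here for the example.

```python
class Behavior:
    """One abstract behavior whose state estimate decays without measurement."""

    def __init__(self, name, growth):
        self.name = name
        self.growth = growth      # assumed rate at which uncertainty accumulates
        self.uncertainty = 0.0    # grows each step the behavior is not measured

    def expected_loss(self):
        # Assumed proxy: the cost of acting on a stale estimate scales
        # with the uncertainty accumulated since the last measurement.
        return self.uncertainty

    def step(self):
        self.uncertainty += self.growth

    def measure(self):
        # A fixation devoted to this behavior resolves its uncertainty.
        self.uncertainty = 0.0


def arbitrate(behaviors):
    # Grant the single gaze resource to the behavior with the most to
    # gain from a measurement, i.e. the largest expected loss.
    return max(behaviors, key=lambda b: b.expected_loss())


# Hypothetical behaviors for a simulated walking agent.
behaviors = [
    Behavior("avoid-obstacles", 0.5),
    Behavior("follow-sidewalk", 0.2),
    Behavior("pick-up-litter", 0.3),
]

schedule = []
for _ in range(6):
    for b in behaviors:
        b.step()
    chosen = arbitrate(behaviors)
    chosen.measure()
    schedule.append(chosen.name)

print(schedule)
```

Run over a few steps, the loop interleaves fixations so that each behavior's share of gaze tracks how quickly its uncertainty grows, which is the qualitative pattern the abstract describes testing against human gaze data in VR.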