Author: Kopp, Stefan, Tepper, P., Striegnitz, K., Ferriman, K., Cassell, J., Nishida, Toyoaki
Language: English
Year of publication: 2007
Subject:
Source: Conversational Informatics: An Engineering Approach
DOI: 10.1002/9780470512470.ch8
Description: Humans intuitively accompany direction-giving with gestures. These gestures have been shown to have the same underlying conceptual structure as diagrams and direction-giving language, but the puzzle is how they communicate given that their form is not codified and may in fact differ from one person or situation to the next. Based on results from a study on language and gesture in direction-giving, we describe a framework for analyzing gestural images into semantic units (image description features) and linking these units to morphological features (hand shape, trajectory, etc.). This feature-based framework allows for implementing an integrated microplanner for multimodal directions that derives the form of both natural language and gesture directly from communicative goals. Using this microplanner we developed an embodied conversational agent that can perform appropriate speech and novel gestures in direction-giving conversation with real humans.
Database: OpenAIRE
External link:
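The description above outlines a pipeline from communicative goals through image description features (IDFs) to gesture morphology produced alongside speech. As a rough illustration only, the following Python sketch shows one way such a feature-based mapping could be organized; the feature names, goal-to-IDF rules, and speech templates are invented placeholders and are not taken from the chapter.

```python
# Illustrative sketch (not the authors' implementation): a toy microplanner that
# maps a communicative goal to image description features (IDFs) and then to
# gesture morphology (handshape, trajectory, orientation) plus a verbal phrase.
# All feature names and mappings below are hypothetical.

from dataclasses import dataclass
from typing import Dict, List


@dataclass
class GestureSpec:
    """Morphological features of a single gesture (hypothetical feature set)."""
    handshape: str      # e.g. "flat", "index-extended"
    trajectory: str     # e.g. "straight", "arc-left"
    orientation: str    # e.g. "palm-down", "palm-lateral"


@dataclass
class MultimodalAct:
    """A coordinated speech + gesture output for one communicative goal."""
    speech: str
    gesture: GestureSpec


# Hypothetical mapping: communicative goal -> image description features (IDFs).
GOAL_TO_IDFS: Dict[str, List[str]] = {
    "describe_path_straight": ["path", "linear-extent"],
    "describe_turn_left": ["path", "direction-change-left"],
    "describe_landmark_tower": ["object", "vertical-extent"],
}

# Hypothetical mapping: IDF -> the morphology features it constrains.
IDF_TO_MORPHOLOGY: Dict[str, Dict[str, str]] = {
    "path": {"orientation": "palm-down"},
    "object": {"orientation": "palm-lateral"},
    "linear-extent": {"handshape": "flat", "trajectory": "straight"},
    "direction-change-left": {"handshape": "flat", "trajectory": "arc-left"},
    "vertical-extent": {"handshape": "flat", "trajectory": "upward"},
}

# Hypothetical speech realizations for the same goals.
SPEECH_TEMPLATES: Dict[str, str] = {
    "describe_path_straight": "Go straight ahead.",
    "describe_turn_left": "Turn left at the corner.",
    "describe_landmark_tower": "You will see a tall tower.",
}


def microplan(goal: str) -> MultimodalAct:
    """Derive both the verbal phrase and the gesture form from one goal."""
    morphology = {"handshape": "relaxed", "trajectory": "none",
                  "orientation": "neutral"}
    # Each IDF contributes the morphology features it constrains; later IDFs
    # override earlier ones in this toy conflict-resolution strategy.
    for idf in GOAL_TO_IDFS[goal]:
        morphology.update(IDF_TO_MORPHOLOGY.get(idf, {}))
    return MultimodalAct(speech=SPEECH_TEMPLATES[goal],
                         gesture=GestureSpec(**morphology))


if __name__ == "__main__":
    for goal in GOAL_TO_IDFS:
        act = microplan(goal)
        print(f"{goal}: say {act.speech!r} with gesture {act.gesture}")
```

The point of the sketch is only the shape of the data flow: a single communicative goal drives both the speech template and the gesture's morphological features, so gesture form is derived from meaning rather than drawn from a fixed gesture lexicon.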