Towards Visual Dialogue for Human-Robot Interaction
Author: | Jose L. Part, Christian Dondrup, Yanchao Yu, Daniel Hernández García, Oliver Lemon, Nancie Gunson |
Language: | English |
Subject: | Computer science; Robotics; Human–robot interaction; Human–computer interaction; Artificial intelligence; Context (language use); Robot; Architecture |
Source: | Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction (HRI Companion) |
DOI: | 10.1145/3434074.3447278 |
Description: | The goal of the EU H2020-ICT funded SPRING project is to develop a socially pertinent robot to carry out tasks in a gerontological healthcare unit. In this context, the ability to perceive its environment and hold coherent, relevant conversations about the surrounding world is critical. In this paper, we describe current progress towards developing the integrated visual and conversational capabilities a robot needs to operate in such environments. Concretely, we introduce an architecture for conversing about objects and other entities present in an environment. The work described in this paper has applications that extend well beyond healthcare: it can be used on any robot that needs to interact with its visual and spatial environment in order to perform its duties. |
Database: | OpenAIRE |
External link: |