Author:
Orlov Savko, Liubov; Qian, Zhiqin; Gremillion, Gregory; Neubauer, Catherine; Canady, Jonroy; Unhelkar, Vaibhav
Subject:
Source:
ACM/IEEE International Conference on Human-Robot Interaction; Mar 2024, p924-928, 5p
Abstract:
To forge effective collaborations with humans, robots require the capacity to understand and predict the behaviors of their human counterparts. There is a growing body of computational research on human modeling for human-robot interaction (HRI). However, a key bottleneck in conducting this research is the relative lack of data on cognitive states -- like intent, workload, and trust -- which undeniably affect human behavior. Despite their significance, these states are difficult to measure, making the assembly of datasets a challenge and hindering the progress of human modeling techniques. To help address this, we first introduce Rescue World for Teams (RW4T): a configurable testbed for simulating disaster response scenarios that require human-robot collaboration. Next, using RW4T, we curate a multimodal dataset of human-robot behavior and cognitive states in dyadic human-robot collaboration. The RW4T dataset includes state, action, and reward sequences, along with all the data necessary to visually replay each task execution. It further contains psychophysiological metrics such as heart rate and pupillometry, complemented by self-reported cognitive state measures. With data from 20 participants, each undertaking five human-robot collaborative tasks, this dataset (comprising 100 unique trajectories), together with the simulator, can serve as a valuable benchmark for human behavior modeling. [ABSTRACT FROM AUTHOR]
Database:
Complementary Index
External link:
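Note: the abstract describes each trial as a trajectory of state, action, and reward sequences aligned with psychophysiological signals and self-reports. The following is a minimal Python sketch of how one such trial record might be organized; every name here (RW4TTrajectory, heart_rate, pupil_diameter, and so on) is an illustrative assumption, not the dataset's actual schema or API.

```python
# Hypothetical layout of one RW4T trial record, inferred from the abstract.
# Field names and types are assumptions, NOT the published dataset's schema.
from dataclasses import dataclass, field
from typing import Any


@dataclass
class RW4TTrajectory:
    """One dyadic human-robot collaborative trial (illustrative only)."""
    participant_id: int                                  # 1..20 per the abstract
    task_id: int                                         # 1..5 tasks per participant
    states: list[Any] = field(default_factory=list)      # environment state sequence
    actions: list[Any] = field(default_factory=list)     # human/robot action sequence
    rewards: list[float] = field(default_factory=list)   # per-step reward sequence
    heart_rate: list[float] = field(default_factory=list)      # heart-rate samples
    pupil_diameter: list[float] = field(default_factory=list)  # pupillometry samples
    self_reports: dict[str, float] = field(default_factory=dict)  # e.g. workload, trust

    def cumulative_reward(self) -> float:
        """Total reward over the trial, a simple behavior summary."""
        return sum(self.rewards)


# 20 participants x 5 tasks gives the 100 unique trajectories the abstract cites.
dataset = [
    RW4TTrajectory(participant_id=p, task_id=t)
    for p in range(1, 21)
    for t in range(1, 6)
]
assert len(dataset) == 100
```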