Helpful DoggyBot: Open-World Object Fetching using Legged Robots and Vision-Language Models

Author: Wu, Qi; Fu, Zipeng; Cheng, Xuxin; Wang, Xiaolong; Finn, Chelsea
Year of publication: 2024
Subject:
Document type: Working Paper
Description: Learning-based methods have achieved strong performance for quadrupedal locomotion. However, several challenges prevent quadrupeds from learning helpful indoor skills that require interaction with environments and humans: the lack of end-effectors for manipulation, limited semantic understanding when using only simulation data, and low traversability and reachability in indoor environments. We present a system for quadrupedal mobile manipulation in indoor environments. It uses a front-mounted gripper for object manipulation, a low-level controller trained in simulation using egocentric depth for agile skills like climbing and whole-body tilting, and pre-trained vision-language models (VLMs) with a third-person fisheye and an egocentric RGB camera for semantic understanding and command generation. We evaluate our system in two unseen environments without any real-world data collection or training. Our system can zero-shot generalize to these environments and complete tasks, such as following a user's command to fetch a randomly placed stuffed toy after climbing over a queen-sized bed, with a 60% success rate. Project website: https://helpful-doggybot.github.io/
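As a rough, hypothetical illustration of the pipeline the description outlines (not code from the paper), the Python sketch below shows a minimal command-to-skill loop: a VLM-style parser turns a user command into an object label, a detector localizes the object, and a low-level controller steps toward it. All names here (vlm_parse_command, detect_object, LowLevelController) are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class Target:
    label: str      # semantic label, e.g. "stuffed toy"
    bearing: float  # angle to the target in the fisheye view (rad)

def vlm_parse_command(user_text: str) -> str:
    """Placeholder: a pre-trained VLM would map free-form text to an object label."""
    return user_text.lower().replace("fetch the ", "").strip()

def detect_object(label: str) -> Target:
    """Placeholder: the third-person fisheye camera + VLM would localize the object."""
    return Target(label=label, bearing=0.3)

class LowLevelController:
    """Placeholder for a simulation-trained locomotion/climbing policy."""
    def step_towards(self, target: Target) -> None:
        print(f"Stepping towards '{target.label}' at bearing {target.bearing:.2f} rad")

def fetch(user_text: str) -> None:
    label = vlm_parse_command(user_text)   # semantic understanding
    target = detect_object(label)          # localization / command generation
    LowLevelController().step_towards(target)  # low-level skill execution

fetch("Fetch the stuffed toy")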
Comment: Project website: https://helpful-doggybot.github.io/
Database: arXiv