Showing 1 - 10 of 8,802 results for search: '"robot manipulation"'
Recent developments in Large Language Models pre-trained on extensive corpora have shown significant success in various natural language processing tasks with minimal fine-tuning. This success offers new promise for robotics, which has long been cons…
External link:
http://arxiv.org/abs/2412.04445
Recent advancements utilizing large-scale video data for learning video generation models demonstrate significant potential in understanding complex physical dynamics. This suggests the feasibility of leveraging diverse robot trajectory data to develop…
External link:
http://arxiv.org/abs/2411.09153
Authors:
Wu, Kun; Hou, Chengkai; Liu, Jiaming; Che, Zhengping; Ju, Xiaozhu; Yang, Zhuqin; Li, Meng; Zhao, Yinuo; Xu, Zhiyuan; Yang, Guang; Zhao, Zhen; Li, Guangyu; Jin, Zhao; Wang, Lecheng; Mao, Jilei; Wang, Xinhua; Fan, Shichao; Liu, Ning; Ren, Pei; Zhang, Qiang; Lyu, Yaoxu; Liu, Mengzhen; He, Jingyang; Luo, Yulin; Gao, Zeyu; Li, Chenxuan; Gu, Chenyang; Fu, Yankai; Wu, Di; Wang, Xingyu; Chen, Sixiang; Wang, Zhenyu; An, Pengju; Qian, Siyuan; Zhang, Shanghang; Tang, Jian
Developing robust and general-purpose robotic manipulation policies is a key goal in the field of robotics. To achieve effective generalization, it is essential to construct comprehensive datasets that encompass a large number of demonstration trajec…
External link:
http://arxiv.org/abs/2412.13877
Teleoperation for robot imitation learning is bottlenecked by hardware availability. Can high-quality robot data be collected without a physical robot? We present a system for augmenting Apple Vision Pro with real-time virtual robot feedback. By prov…
External link:
http://arxiv.org/abs/2412.10631
Robots can acquire complex manipulation skills by learning policies from expert demonstrations, which is often known as vision-based imitation learning. Generating policies based on diffusion and flow matching models has been shown to be effective, p…
External link:
http://arxiv.org/abs/2412.04987
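The entry above points to diffusion and flow matching models as policy generators for vision-based imitation learning. Purely as an illustrative sketch, the snippet below shows how inference with a diffusion-style visuomotor policy is commonly structured: an action chunk is initialized from noise and iteratively denoised, conditioned on an observation embedding. The names obs_encoder and noise_pred_net and the diffusers-style scheduler interface are assumptions for illustration, not the API of the linked paper.

    # Hypothetical sketch of diffusion-policy inference (not the linked paper's code).
    # Assumes a HuggingFace-diffusers-style noise scheduler plus two user-provided
    # networks: an observation encoder and a noise-prediction network.
    import torch

    @torch.no_grad()
    def sample_action_chunk(obs_encoder, noise_pred_net, scheduler,
                            images, horizon=16, action_dim=7):
        """Denoise a random action sequence into an executable action chunk."""
        obs_emb = obs_encoder(images)                                 # (B, D) conditioning
        actions = torch.randn(images.shape[0], horizon, action_dim)   # start from pure noise
        for t in scheduler.timesteps:                                 # reverse diffusion steps
            eps = noise_pred_net(actions, t, obs_emb)                 # predict injected noise
            actions = scheduler.step(eps, t, actions).prev_sample     # one denoising update
        return actions                                                # (B, horizon, action_dim)

A flow matching variant keeps the same conditioning but replaces the stochastic denoising loop with integration of a learned velocity field.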
Deploying robots in open-world environments involves complex tasks characterized by long sequences and rich interactions, necessitating efficient transfer of robotic skills across diverse and complex scenarios. To address this challenge, we propose a…
External link:
http://arxiv.org/abs/2411.11714
Authors:
Nasiriany, Soroush; Kirmani, Sean; Ding, Tianli; Smith, Laura; Zhu, Yuke; Driess, Danny; Sadigh, Dorsa; Xiao, Ted
We explore how intermediate policy representations can facilitate generalization by providing guidance on how to perform manipulation tasks. Existing representations such as language, goal images, and trajectory sketches have been shown to be helpful…
External link:
http://arxiv.org/abs/2411.02704
In this paper, we perform robot manipulation activities in real-world environments with language contexts by integrating a compact referring image segmentation model into the robot's perception module. First, we propose CLIPU$^2$Net, a lightweight re…
External link:
http://arxiv.org/abs/2409.11518
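The entry above integrates a compact referring image segmentation model into the robot's perception module. As a rough illustration only, the snippet below shows one way such a component can be wired in: a language expression is mapped to a segmentation mask, and the mask is reduced to a pixel centroid plus depth that downstream grasp or motion planning could consume. seg_model.predict is a hypothetical placeholder, not CLIPU$^2$Net's actual interface.

    # Hypothetical sketch of using a referring-segmentation model inside a robot
    # perception pipeline; seg_model is a placeholder, not the linked paper's API.
    import numpy as np

    def locate_target(seg_model, rgb_image, depth_image, expression):
        """Return (u, v, depth) for the image region named by `expression`, or None."""
        mask = seg_model.predict(rgb_image, expression)   # (H, W) boolean mask
        ys, xs = np.nonzero(mask)
        if xs.size == 0:
            return None                                   # nothing matched the expression
        u, v = int(xs.mean()), int(ys.mean())             # pixel centroid of the mask
        depth = float(np.median(depth_image[mask]))       # robust depth at the target
        return u, v, depth                                # hand off to grasp planning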
Authors:
Zhang, Fan; Gienger, Michael
We present a framework for assistive robot manipulation, which focuses on two fundamental challenges: first, efficiently adapting large-scale models to downstream scene affordance understanding tasks, especially in daily living scenarios where gather…
External link:
http://arxiv.org/abs/2409.01083