Showing 1 - 8 of 8
for the search: '"Attarian, Maria"'
Author:
Jain, Vidhi, Attarian, Maria, Joshi, Nikhil J, Wahid, Ayzaan, Driess, Danny, Vuong, Quan, Sanketi, Pannag R, Sermanet, Pierre, Welker, Stefan, Chan, Christine, Gilitschenski, Igor, Bisk, Yonatan, Dwibedi, Debidatta
Large-scale multi-task robotic manipulation systems often rely on text to specify the task. In this work, we explore whether a robot can learn by observing humans. To do so, the robot must understand a person's intent and perform the inferred task…
External link:
http://arxiv.org/abs/2403.12943
Author:
Liang, Jacky, Xia, Fei, Yu, Wenhao, Zeng, Andy, Arenas, Montserrat Gonzalez, Attarian, Maria, Bauza, Maria, Bennice, Matthew, Bewley, Alex, Dostmohamed, Adil, Fu, Chuyuan Kelly, Gileadi, Nimrod, Giustina, Marissa, Gopalakrishnan, Keerthana, Hasenclever, Leonard, Humplik, Jan, Hsu, Jasmine, Joshi, Nikhil, Jyenis, Ben, Kew, Chase, Kirmani, Sean, Lee, Tsang-Wei Edward, Lee, Kuang-Huei, Michaely, Assaf Hurwitz, Moore, Joss, Oslund, Ken, Rao, Dushyant, Ren, Allen, Tabanpour, Baruch, Vuong, Quan, Wahid, Ayzaan, Xiao, Ted, Xu, Ying, Zhuang, Vincent, Xu, Peng, Frey, Erik, Caluwaerts, Ken, Zhang, Tingnan, Ichter, Brian, Tompson, Jonathan, Takayama, Leila, Vanhoucke, Vincent, Shafran, Izhak, Mataric, Maja, Sadigh, Dorsa, Heess, Nicolas, Rao, Kanishka, Stewart, Nik, Tan, Jie, Parada, Carolina
Large language models (LLMs) have been shown to exhibit a wide range of capabilities, such as writing robot code from language commands -- enabling non-experts to direct robot behaviors, modify them based on feedback, or compose them to perform new tasks…
External link:
http://arxiv.org/abs/2402.11450
Author:
Attarian, Maria, Asif, Muhammad Adil, Liu, Jingzhou, Hari, Ruthrash, Garg, Animesh, Gilitschenski, Igor, Tompson, Jonathan
Published in:
7th Annual Conference on Robot Learning, 2023
Many existing learning-based grasping approaches concentrate on a single embodiment, provide limited generalization to higher-DoF end-effectors, and cannot capture a diverse set of grasp modes. We tackle the problem of grasping using multiple embodiments…
External link:
http://arxiv.org/abs/2312.03864
Cognitive planning is the structural decomposition of complex tasks into a sequence of future behaviors. In the computational setting, performing cognitive planning entails grounding plans and concepts in one or more modalities in order to leverage…
External link:
http://arxiv.org/abs/2210.03825
Author:
Zeng, Andy, Attarian, Maria, Ichter, Brian, Choromanski, Krzysztof, Wong, Adrian, Welker, Stefan, Tombari, Federico, Purohit, Aveek, Ryoo, Michael, Sindhwani, Vikas, Lee, Johnny, Vanhoucke, Vincent, Florence, Pete
Large pretrained (e.g., "foundation") models exhibit distinct capabilities depending on the domain of data they are trained on. While these domains are generic, they may only barely overlap. For example, visual-language models (VLMs) are trained on…
External link:
http://arxiv.org/abs/2204.00598
Author:
Zeng, Andy, Florence, Pete, Tompson, Jonathan, Welker, Stefan, Chien, Jonathan, Attarian, Maria, Armstrong, Travis, Krasin, Ivan, Duong, Dan, Wahid, Ayzaan, Sindhwani, Vikas, Lee, Johnny
Robotic manipulation can be formulated as inducing a sequence of spatial displacements, where the space being moved can encompass an object, part of an object, or an end effector. In this work, we propose the Transporter Network, a simple model architecture…
External link:
http://arxiv.org/abs/2010.14406
Deep-learning vision models have shown intriguing similarities and differences with respect to human vision. We investigate how to bring machine visual representations into better alignment with human representations. Human representations are often…
External link:
http://arxiv.org/abs/2010.06512
Author:
Castro, Pablo Samuel, Attarian, Maria
The use of language models for generating lyrics and poetry has received increased interest in the last few years. They pose a unique challenge relative to standard natural language problems, as their ultimate purpose is creative; notions of accuracy…
External link:
http://arxiv.org/abs/1811.04651