Showing 1 - 10 of 42 for search: '"Wijmans, Erik"'
Not yet. We present SPACE, a benchmark that systematically evaluates spatial cognition in frontier models. Our benchmark builds on decades of research in cognitive science. It evaluates large-scale mapping abilities that are brought to bear when an…
External link:
http://arxiv.org/abs/2410.06468
Animal navigation research posits that organisms build and maintain internal spatial representations, or maps, of their environment. We ask if machines -- specifically, artificial intelligence (AI) navigation agents -- also build implicit (or 'mental…
External link:
http://arxiv.org/abs/2301.13261
We study ObjectGoal Navigation -- where a virtual robot situated in a new environment is asked to navigate to an object. Prior work has shown that imitation learning (IL) using behavior cloning (BC) on a dataset of human demonstrations achieves promising…
External link:
http://arxiv.org/abs/2301.07302
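
As a rough illustration of the behavior cloning recipe this abstract mentions, here is a minimal Python sketch (a stand-in, not the paper's code; the 84x84 frame size, six-action space, and learning rate are assumptions): a single BC update is a cross-entropy loss between the policy's action logits and the action the human demonstrator took.

import torch
import torch.nn as nn

# Stand-in policy: flattened RGB frame -> logits over 6 discrete actions.
policy = nn.Sequential(nn.Flatten(), nn.Linear(3 * 84 * 84, 6))
optimizer = torch.optim.Adam(policy.parameters(), lr=2.5e-4)
loss_fn = nn.CrossEntropyLoss()

def bc_step(rgb_batch, demo_actions):
    """One behavior-cloning update: imitate the demonstrated action per frame."""
    logits = policy(rgb_batch)
    loss = loss_fn(logits, demo_actions)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example with random stand-ins for a batch of 84x84 RGB frames and actions.
print(bc_step(torch.rand(8, 3, 84, 84), torch.randint(0, 6, (8,))))
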
We present Variable Experience Rollout (VER), a technique for efficiently scaling batched on-policy reinforcement learning in heterogeneous environments (where different environments take vastly different times to generate rollouts) to many GPUs residing…
External link:
http://arxiv.org/abs/2210.05064
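
To make the scheduling problem concrete, here is a toy Python sketch of variable-length rollout collection (an abstraction under assumptions, not the VER implementation): environments with different per-step costs are stepped until a shared step budget is reached, so fast environments contribute long rollouts and slow ones short rollouts, instead of everyone waiting for the slowest.

import random

NUM_ENVS = 4
STEP_BUDGET = 64  # total environment steps per learning batch

# Pretend each environment has a different per-step cost (e.g. scene complexity).
step_cost = [random.uniform(0.5, 4.0) for _ in range(NUM_ENVS)]

def collect_batch():
    elapsed = [0.0] * NUM_ENVS  # simulated wall-clock time per environment
    steps = [0] * NUM_ENVS      # steps contributed per environment
    total = 0
    while total < STEP_BUDGET:
        # Step whichever environment would be ready next (least elapsed time).
        i = min(range(NUM_ENVS), key=lambda j: elapsed[j])
        elapsed[i] += step_cost[i]
        steps[i] += 1
        total += 1
    return steps

print("steps per env in one batch:", collect_batch())
# Fast environments end up with longer rollouts than slow ones, which is why
# the learner has to consume variable-length experience.
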
Authors:
Partsey, Ruslan, Wijmans, Erik, Yokoyama, Naoki, Dobosevych, Oles, Batra, Dhruv, Maksymets, Oleksandr
Can an autonomous agent navigate in a new environment without building an explicit map? For the task of PointGoal navigation ('Go to $\Delta x$, $\Delta y$') under idealized settings (no RGB-D and actuation noise, perfect GPS+Compass), the answer is…
External link:
http://arxiv.org/abs/2206.00997
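
For readers unfamiliar with the task, the PointGoal interface can be sketched in a few lines of Python (an illustration under assumptions; the 0.2 m success radius mirrors common PointNav setups and is not taken from this paper): the goal is a displacement (Δx, Δy) in the start frame, an idealized GPS+Compass yields the goal vector relative to the agent, and an episode succeeds if the agent stops close enough.

import math

def goal_in_agent_frame(agent_xy, agent_heading, goal_xy):
    """Distance and relative angle to the goal from the agent's pose,
    i.e. the signal a 'perfect GPS+Compass' would provide."""
    dx = goal_xy[0] - agent_xy[0]
    dy = goal_xy[1] - agent_xy[1]
    distance = math.hypot(dx, dy)
    angle = math.atan2(dy, dx) - agent_heading
    angle = (angle + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
    return distance, angle

def is_success(agent_xy, goal_xy, called_stop, radius=0.2):
    """An episode succeeds if the agent calls STOP within `radius` meters."""
    return called_stop and math.hypot(goal_xy[0] - agent_xy[0],
                                      goal_xy[1] - agent_xy[1]) <= radius

# Example: goal is 'Go to dx=3, dy=4' from the origin, facing along +x.
print(goal_in_agent_frame((0.0, 0.0), 0.0, (3.0, 4.0)))       # (5.0, ~0.93 rad)
print(is_success((2.9, 4.05), (3.0, 4.0), called_stop=True))  # True
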
We propose a novel architecture and training paradigm for training realistic PointGoal Navigation -- navigating to a target coordinate in an unseen environment under actuation and sensor noise without access to ground-truth localization. Specifically…
External link:
http://arxiv.org/abs/2109.08677
Authors:
Ramakrishnan, Santhosh K., Gokaslan, Aaron, Wijmans, Erik, Maksymets, Oleksandr, Clegg, Alex, Turner, John, Undersander, Eric, Galuba, Wojciech, Westbury, Andrew, Chang, Angel X., Savva, Manolis, Zhao, Yili, Batra, Dhruv
We present the Habitat-Matterport 3D (HM3D) dataset. HM3D is a large-scale dataset of 1,000 building-scale 3D reconstructions from a diverse set of real-world locations. Each scene in the dataset consists of a textured 3D mesh reconstruction of interiors…
External link:
http://arxiv.org/abs/2109.08238
We present Megaverse, a new 3D simulation platform for reinforcement learning and embodied AI research. The efficient design of our engine enables physics-based simulation with high-dimensional egocentric observations at more than 1,000,000 actions per second…
External link:
http://arxiv.org/abs/2107.08170
Authors:
Szot, Andrew, Clegg, Alex, Undersander, Eric, Wijmans, Erik, Zhao, Yili, Turner, John, Maestre, Noah, Mukadam, Mustafa, Chaplot, Devendra, Maksymets, Oleksandr, Gokaslan, Aaron, Vondrus, Vladimir, Dharur, Sameer, Meier, Franziska, Galuba, Wojciech, Chang, Angel, Kira, Zsolt, Koltun, Vladlen, Malik, Jitendra, Savva, Manolis, Batra, Dhruv
We introduce Habitat 2.0 (H2.0), a simulation platform for training virtual robots in interactive 3D environments and complex physics-enabled scenarios. We make comprehensive contributions to all levels of the embodied AI stack - data, simulation, and…
External link:
http://arxiv.org/abs/2106.14405
ObjectGoal Navigation (ObjectNav) is an embodied task wherein agents are to navigate to an object instance in an unseen environment. Prior works have shown that end-to-end ObjectNav agents that use vanilla visual and recurrent modules, e.g. a CNN+RNN…
External link:
http://arxiv.org/abs/2104.04112
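
The "vanilla" CNN+RNN baseline this abstract refers to can be sketched generically in Python (a skeleton under assumptions, not the paper's architecture): a small convolutional encoder per frame, a GRU cell that carries memory across timesteps, and a linear head over discrete actions.

import torch
import torch.nn as nn

class CNNRNNAgent(nn.Module):
    def __init__(self, num_actions: int = 6, hidden_size: int = 512):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 9 * 9, hidden_size), nn.ReLU(),  # sized for 84x84 RGB
        )
        self.rnn = nn.GRUCell(hidden_size, hidden_size)
        self.actor = nn.Linear(hidden_size, num_actions)

    def forward(self, rgb, hidden):
        """One timestep: encode the frame, update memory, emit action logits."""
        features = self.cnn(rgb)
        hidden = self.rnn(features, hidden)
        return self.actor(hidden), hidden

agent = CNNRNNAgent()
hidden = torch.zeros(1, 512)
for _ in range(3):  # roll the policy forward over a few frames
    logits, hidden = agent(torch.rand(1, 3, 84, 84), hidden)
    action = torch.distributions.Categorical(logits=logits).sample()
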