Learning robust, real-time, reactive robotic grasping
Authors: | Jürgen Leitner, Peter Corke, Douglas Morrison |
Year of publication: | 2019 |
Subject: | Computer science, computer vision, artificial intelligence, deep neural networks, convolutional neural network, grasping, applied mathematics, mechanical engineering, electrical and electronic engineering, modeling and simulation, software |
Source: | The International Journal of Robotics Research. 39:183-201 |
ISSN: | 1741-3176, 0278-3649 |
Description: | We present a novel approach to perform object-independent grasp synthesis from depth images via deep neural networks. Our generative grasping convolutional neural network (GG-CNN) predicts a pixel-wise grasp quality that can be deployed in closed-loop grasping scenarios. GG-CNN overcomes shortcomings in existing techniques, namely discrete sampling of grasp candidates and long computation times. The network is orders of magnitude smaller than other state-of-the-art approaches while achieving better performance, particularly in clutter. We run a suite of real-world tests, during which we achieve an 84% grasp success rate on a set of previously unseen objects with adversarial geometry and 94% on household items. The network's lightweight nature enables closed-loop control at up to 50 Hz, with which we observed an 88% grasp success rate on a set of household objects that are moved during the grasp attempt. We further propose a method combining our GG-CNN with a multi-view approach, which improves the overall grasp success rate in clutter by 10%. Code is provided at https://github.com/dougsm/ggcnn |
Database: | OpenAIRE |
External link: |
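The description above notes that GG-CNN outputs a pixel-wise grasp quality map rather than scoring discrete grasp candidates. A minimal sketch of the post-processing such an output implies — selecting a single grasp pose from per-pixel quality, angle, and width maps — is shown below. The function name and the dictionary layout are illustrative assumptions, not the paper's actual interface; only the per-pixel parameterisation (quality, gripper angle, gripper width) is taken from the source.

```python
import numpy as np

def select_grasp(quality, angle, width):
    """Pick the best grasp from pixel-wise output maps.

    Hypothetical helper: given same-shaped 2-D arrays of per-pixel grasp
    quality, gripper angle, and gripper width, return the grasp at the
    highest-quality pixel.
    """
    # The pixel with maximum predicted quality gives the grasp centre.
    idx = np.unravel_index(np.argmax(quality), quality.shape)
    return {
        "pixel": idx,                     # (row, col) grasp centre
        "quality": float(quality[idx]),   # predicted grasp quality
        "angle": float(angle[idx]),       # gripper rotation (rad)
        "width": float(width[idx]),       # gripper opening width (m)
    }

# Toy maps standing in for network output on a 4x4 depth image.
q = np.zeros((4, 4))
q[2, 1] = 0.9                # one confident grasp location
a = np.full((4, 4), 0.3)     # uniform angle map
w = np.full((4, 4), 0.05)    # uniform width map
grasp = select_grasp(q, a, w)
```

In a closed-loop setting this selection would be repeated on every new depth frame, which is what makes the reported 50 Hz control rate useful for grasping moving objects.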