PRED18: Dataset and Further Experiments with DAVIS Event Camera in Predator-Prey Robot Chasing

Author: Moeys, Diederik Paul; Neil, Daniel; Corradi, Federico; Kerr, Emmett; Vance, Philip; Das, Gautham; Coleman, Sonya A.; McGinnity, Thomas M.; Kerr, Dermot; Delbruck, Tobi
Publication year: 2018
Subject:
Source: IEEE EBCCSP 2018
Document type: Working Paper
Description: Machine vision systems using convolutional neural networks (CNNs) for robotic applications are increasingly being developed. Conventional vision CNNs are driven by camera frames at a constant sample rate, resulting in a fixed latency and power-consumption tradeoff. This paper describes further work on the first experiments with a closed-loop robotic system integrating a CNN with a Dynamic and Active Pixel Vision Sensor (DAVIS) in a predator/prey scenario. The DAVIS, mounted on the predator Summit XL robot, produces frames at a fixed 15 Hz frame rate and Dynamic Vision Sensor (DVS) histograms containing 5k ON and OFF events at a variable frame rate ranging from 15 to 500 Hz depending on the robot speeds. In contrast to conventional frame-based systems, the latency and processing cost depend on the rate of change of the image. The CNN is trained offline on a 1.25 h labeled dataset to recognize the position and size of the prey robot in the field of view of the predator. During inference, combining the ten output classes of the CNN allows extraction of the analog position vector of the prey relative to the predator, with a mean error of 8.7% in angular estimation. The system is compatible with conventional deep learning technology, but achieves a variable latency-power tradeoff that adapts automatically to the scene dynamics. Finally, investigations into the robustness of the algorithm, a comparison with human performance, and a deconvolution analysis are also presented.
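
The description mentions two mechanisms that a short sketch can make concrete: DVS events are accumulated into constant-count histograms (5k ON and OFF events per frame, so the effective frame rate tracks scene activity), and the ten CNN output classes are combined into an analog position estimate of the prey. The following Python sketch illustrates both steps; the sensor resolution, the function names, and the probability-weighted decoding scheme are illustrative assumptions rather than details taken from the paper.

import numpy as np

# Assumed DAVIS240-style resolution and the constant-count frame size
# mentioned in the description (illustrative values, not from the paper text).
WIDTH, HEIGHT = 240, 180
EVENTS_PER_FRAME = 5000

def accumulate_histogram(events):
    # Build a 2-channel (OFF/ON) histogram from a stream of (x, y, polarity)
    # events. The frame closes once EVENTS_PER_FRAME events are collected,
    # so the effective frame rate rises and falls with the robots' speed.
    frame = np.zeros((2, HEIGHT, WIDTH), dtype=np.float32)
    count = 0
    for x, y, polarity in events:
        frame[1 if polarity > 0 else 0, y, x] += 1.0
        count += 1
        if count >= EVENTS_PER_FRAME:
            break
    return frame

def decode_position(class_probs, class_centers):
    # Combine the ten class outputs into one analog estimate by taking the
    # probability-weighted mean of the angular centers each class represents.
    class_probs = np.asarray(class_probs, dtype=np.float64)
    class_probs = class_probs / class_probs.sum()
    return float(np.dot(class_probs, np.asarray(class_centers, dtype=np.float64)))

A weighted combination of this kind yields sub-class angular resolution from a small classifier head, which is one plausible way to obtain the analog position vector that the description refers to.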
Comment: 8 pages
Database: arXiv