Low Latency Event-Based Filtering and Feature Extraction for Dynamic Vision Sensors in Real-Time FPGA Applications

Authors: Diederik Paul Moeys, Fernando Perez-Peña, F. Gomez-Rodriguez, Gabriel Jimenez-Moreno, Tobi Delbruck, Shih-Chii Liu, Alejandro Linares-Barranco
Contributors: University of Zurich; Linares-Barranco, Alejandro; Ingeniería en Automática, Electrónica, Arquitectura y Redes de Computadores; Universidad de Sevilla, Departamento de Arquitectura y Tecnología de Computadores; TEP-108: Robótica y Tecnología de Computadores Aplicada a la Rehabilitación
Language: English
Year of publication: 2019
Subjects:
General Computer Science
General Engineering
General Materials Science
Computer science
Neuromorphic engineering
Address-event representation (AER)
Dynamic vision
Frame-free vision
Event-based processing
Event-based filters
Field-programmable gate arrays (FPGA)
VHDL
Feature extraction
Real-time computing
Filter (signal processing)
Pixel
Stereopsis
Asynchronous communication
Video tracking
Institute of Neuroinformatics
Life sciences; biology
lcsh:Electrical engineering. Electronics. Nuclear engineering
lcsh:TK1-9971
Source: IEEE Access, Vol. 7, pp. 134926-134942 (2019)
RODIN. Repositorio de Objetos de Docencia e Investigación de la Universidad de Cádiz
idUS. Depósito de Investigación de la Universidad de Sevilla
ISSN: 2169-3536
Description: Dynamic Vision Sensor (DVS) pixels produce an asynchronous, variable-rate address-event output that represents brightness changes at each pixel. Since these sensors produce frame-free output, they are ideal for real-time dynamic vision applications with strict latency and power constraints. Event-based filtering algorithms have been proposed to post-process the asynchronous event output to reduce sensor noise, extract low-level features, and track objects, among other tasks. These post-processing algorithms help to increase the performance and accuracy of further processing stages such as classification using spike-based learning (i.e., ConvNets), stereo vision, and visual servoing of robots. This paper presents an FPGA-based library of these post-processing event-based algorithms with implementation details; specifically, background activity (noise) filtering, pixel masking, object motion detection, and object tracking. The latencies of these filters on the Field-Programmable Gate Array (FPGA) platform are below 300 ns, with an average latency reduction of 188% (maximum of 570%) over the software versions running on a desktop PC CPU. This open-source event-based filter IP library for FPGA has been tested on two different platforms and scenarios, using different synthesis and implementation tools for the Lattice and Xilinx vendors.
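The background-activity (noise) filter named in the abstract is, in most event-based pipelines, a spatio-temporal correlation check: an event survives only if a nearby pixel fired recently. The Python sketch below illustrates that general idea in software; the 240x180 sensor size, the 2 ms window, and the function and field names are illustrative assumptions, not parameters or code from the paper's FPGA (VHDL) implementation.

import numpy as np

def background_activity_filter(events, width=240, height=180, dt_us=2000):
    """Software sketch of a DVS background-activity (noise) filter.

    An event at (x, y) is kept only if one of its 8 neighbours (or the
    pixel itself) fired within the last dt_us microseconds; isolated
    events are treated as noise and dropped.

    events is an iterable of (x, y, timestamp_us, polarity) tuples,
    assumed to be in timestamp order. Sensor size and the 2 ms window
    are illustrative values only.
    """
    # Last-event timestamp per pixel; the one-pixel border avoids bounds checks.
    last_ts = np.full((height + 2, width + 2), -np.inf)
    kept = []
    for x, y, ts, pol in events:
        neighbourhood = last_ts[y:y + 3, x:x + 3]   # 3x3 window around (x, y)
        if ts - neighbourhood.max() <= dt_us:
            kept.append((x, y, ts, pol))
        last_ts[y + 1, x + 1] = ts                  # update after the test
    return kept

# Example: the first event of a burst is dropped because nothing precedes it,
# the correlated follow-up event passes, and the isolated event is rejected.
events = [(10, 10, 1000, 1), (11, 10, 1500, 1), (120, 90, 2000, 0)]
print(background_activity_filter(events))           # [(11, 10, 1500, 1)]

The paper's hardware version performs this kind of per-event check directly on the event stream, which is what keeps the reported filter latency below 300 ns.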
Database: OpenAIRE