Identifying Light Interference in Event-Based Vision

Author: Shi, Chenyang, Li, Yuzhen, Song, Ningfang, Wei, Boyi, Zhang, Yibo, Li, Wenzhuo, Jin, Jing
Source: IEEE Transactions on Circuits and Systems for Video Technology; 2024, Vol. 34, Issue 6, pp. 4800-4816, 17p
Abstract: Light interference negatively impacts frame-based visual tasks. Phenomena such as overexposure cause the loss of valuable information and reduce task execution efficiency. Event cameras are neuromorphic vision sensors that output sparse, asynchronous streams of events rather than frames. These cameras feature high temporal resolution, high dynamic range, and low power consumption. As a result, they are not susceptible to overexposure and motion blur, and they are able to recognize light interference such as strobe lights, stray lights, and reflections. However, event cameras are highly sensitive to light intensity changes, so light interference still affects event cameras as noise that easily aliases with events triggered by environmental objects. Therefore, to reduce or eliminate the negative impact of light interference on event cameras, we systematically analyze the optical properties and event-triggering principles of these forms of light interference, and then propose the ELIR (Event-based Light Interference Removal) method for removing light interference signals from event streams in static and dynamic scenes. The proposed method is validated on object detection tasks. Additionally, we release the LIED datasets to evaluate the effect of light interference removal in event streams and to assist other studies in this field. Experimental results on the LIED datasets show that our proposed method can remove, on average, over 97% of light interference in static scenes and over 86% in dynamic scenes. Finally, the proposed method is verified on the object detection task, achieving an average PRE of over 92%. The dataset is available at https://github.com/shicy17/LIED.
Database: Supplemental Index