A neuromorphic dataset for tabletop object segmentation in indoor cluttered environment.
Author: | Huang X (Advanced Research and Innovation Center (ARIC), Khalifa University, Abu Dhabi, UAE; Khalifa University Center for Autonomous Robotic Systems (KUCARS), Khalifa University, Abu Dhabi, UAE); Kachole S (School of Computer Science and Mathematics, Kingston University, London, UK); Ayyad A (Advanced Research and Innovation Center (ARIC), Khalifa University, Abu Dhabi, UAE); Naeini FB (School of Computer Science and Mathematics, Kingston University, London, UK); Makris D (School of Computer Science and Mathematics, Kingston University, London, UK); Zweiri Y (Advanced Research and Innovation Center (ARIC), Khalifa University, Abu Dhabi, UAE; Department of Aerospace Engineering, Khalifa University, Abu Dhabi, UAE; yahya.zweiri@ku.ac.ae) |
---|---|
Language: | English |
Source: | Scientific Data [Sci Data] 2024 Jan 25; Vol. 11 (1), pp. 127. Date of Electronic Publication: 2024 Jan 25. |
DOI: | 10.1038/s41597-024-02920-1 |
Abstract: | Event-based cameras are commonly leveraged to mitigate issues such as motion blur, low dynamic range, and limited time sampling, which plague conventional cameras. However, dedicated event-based datasets for benchmarking segmentation algorithms are scarce, particularly datasets that offer the depth information critical for occluded scenes. In response, this paper introduces a novel Event-based Segmentation Dataset (ESD), a high-quality 3D spatio-temporal event dataset designed for indoor object segmentation in cluttered environments. ESD comprises 145 sequences with 14,166 manually annotated RGB frames, along with 21.88 million and 20.80 million events recorded by two stereo-configured event-based cameras. This densely annotated 3D spatio-temporal event-based segmentation benchmark for tabletop objects is a pioneering initiative, providing event-wise depth and annotated instance labels in addition to the corresponding RGBD frames. By releasing ESD, we aim to offer the research community a challenging segmentation benchmark of exceptional quality. (© 2024. The Author(s).) |
Database: | MEDLINE |
External link: |
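
The abstract above describes what ESD contains (stereo event streams with per-event depth and instance labels, plus annotated RGB-D frames) but not its on-disk format. The sketch below shows one plausible way to represent and slice such data in Python; the structured-array field names, the 346×260 sensor resolution, and the synthetic-data helper are assumptions for illustration only and do not reflect the actual ESD layout.

```python
# Hypothetical sketch of an event-based segmentation sample, loosely following the
# dataset described in the abstract. Field names, units, and sensor resolution are
# assumptions, not the actual ESD format.
import numpy as np

# One event per row: timestamp (microseconds), pixel x, pixel y, polarity,
# per-event depth (metres), and instance label id (0 = background). All assumed.
EVENT_DTYPE = np.dtype([
    ("t", np.int64),
    ("x", np.uint16),
    ("y", np.uint16),
    ("p", np.int8),
    ("depth", np.float32),
    ("label", np.int32),
])

def synthetic_events(n=10_000, width=346, height=260, n_objects=5, seed=0):
    """Generate a dummy event stream in the assumed layout (illustration only)."""
    rng = np.random.default_rng(seed)
    ev = np.empty(n, dtype=EVENT_DTYPE)
    ev["t"] = np.sort(rng.integers(0, 1_000_000, size=n))   # ~1 s of data
    ev["x"] = rng.integers(0, width, size=n)
    ev["y"] = rng.integers(0, height, size=n)
    ev["p"] = rng.choice([-1, 1], size=n)
    ev["depth"] = rng.uniform(0.3, 2.0, size=n)
    ev["label"] = rng.integers(0, n_objects + 1, size=n)
    return ev

def slice_time_window(events, t_start_us, t_end_us):
    """Return the events falling inside [t_start_us, t_end_us)."""
    i0, i1 = np.searchsorted(events["t"], [t_start_us, t_end_us])
    return events[i0:i1]

def label_frame(events, width=346, height=260):
    """Rasterise per-event instance labels into a dense segmentation map
    (events later in the array overwrite earlier ones at the same pixel)."""
    frame = np.zeros((height, width), dtype=np.int32)
    frame[events["y"], events["x"]] = events["label"]
    return frame

if __name__ == "__main__":
    ev = synthetic_events()
    window = slice_time_window(ev, 0, 50_000)   # first 50 ms
    seg = label_frame(window)
    print(f"{len(window)} events in window, {len(np.unique(seg)) - 1} object ids present")
```

Slicing the stream into fixed time windows and rasterising the per-event labels, as shown here, is a common way to turn 3D spatio-temporal event data into dense segmentation targets; consult the dataset's own documentation for the definitive format and tooling.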