Vision-Based Real-Time Obstacle Segmentation Algorithm for Autonomous Surface Vehicle
Authors: | Yonggil Jo, Byeolteo Park, Donghwa Lee, Donghoon Kim, Jungmo Koo, Hanguen Kim, Hyun Myung |
---|---|
Year of publication: | 2019 |
Subject: | computer vision, deep learning, obstacle segmentation, autonomous surface vehicle, ship navigation, radar, segmentation, frame rate, Intersection over Union |
Source: | IEEE Access, Vol 7, Pp 179420-179428 (2019) |
ISSN: | 2169-3536 |
Description: | Among the various sensors used to detect obstacles in marine environments, vision sensors are the most fundamental. They are strongly affected by the surrounding environment and cannot recognize distant objects; nevertheless, they can detect nearby objects that radar cannot, including small obstacles such as buoys and boats not equipped with an automatic identification system (AIS). Vision sensors and radar can therefore be used in a complementary manner. This paper proposes a vision-based segmentation model, called Skip-ENet, for recognizing obstacles in real time. Its computational cost is not significantly higher than that of ENet, yet Skip-ENet segments complex marine obstacles more effectively, improving class accuracy and mean Intersection over Union (mIoU). The model also enables even low-cost embedded systems to process 10 or more frames per second (fps). The superiority of the proposed model was verified by comparing its performance with that of conventional segmentation models: MobileNet, ENet, and DeeplabV3+. |
Database: | OpenAIRE |
External link: |
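The abstract evaluates segmentation quality with mean Intersection over Union (mIoU). As a minimal sketch of how that metric is typically computed over per-pixel class labels (the function name and toy label maps below are illustrative, not taken from the paper):

```python
import numpy as np

def mean_iou(pred, target, num_classes):
    """Mean Intersection over Union across classes.

    pred, target: integer arrays of per-pixel class labels.
    Classes absent from both maps are skipped so they do not
    drag the average to zero.
    """
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, target == c).sum()
        union = np.logical_or(pred == c, target == c).sum()
        if union > 0:
            ious.append(inter / union)
    return float(np.mean(ious))

# Toy 2x2 label maps with 2 classes (hypothetical data):
pred = np.array([[0, 1], [1, 1]])
target = np.array([[0, 1], [0, 1]])
print(mean_iou(pred, target, 2))  # class 0: 1/2, class 1: 2/3 -> mean 7/12
```

Real evaluations accumulate the per-class intersection and union counts over the whole test set before dividing, rather than averaging per-image scores.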