Author: Kao, Youchen; Che, Shengbing; Zhou, Sha; Guo, Shenyi; Zhang, Xu; Wang, Wanqin
Subject:
Source: Scientific Reports; 7/16/2024, Vol. 14 Issue 1, p1-17, 17p
Abstract:
Lane line images exhibit large-scale variation and complex scene information, and the high similarity between adjacent lane lines easily causes classification errors. Moreover, distant lane lines are difficult to recognize because perspective changes narrow their apparent width. To address these issues, this paper proposes an effective lane detection framework: a hybrid feature fusion network that enhances multiple spatial features and distinguishes key features along the entire lane line segment. It enhances and fuses lane line features at multiple scales to strengthen the feature representation of lane line images, especially at the far end. First, to strengthen the correlation among multiscale lane features, multi-head self-attention is used to construct a multi-space attention enhancement module that enhances features across multiple spaces. Second, a spatially separable convolutional branch is designed for the skip-layer structure connecting multiscale lane line features; while retaining feature information at different scales, it emphasizes important lane areas in the multiscale features through the allocation of spatial attention weights. Finally, because lane lines occupy elongated regions of the image and background information far outnumbers lane line information, traditional pooling operations are poorly suited to capturing the anisotropic contexts common in real environments. Therefore, strip pooling is introduced before the feature output branches to refine the representation of lane line information and optimize model performance. Experimental results show that the accuracy on the TuSimple dataset reaches 96.84% and the F1 score on the CULane dataset reaches 75.9%. [ABSTRACT FROM AUTHOR]
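The full text is not available in this record, so as a rough illustration of the strip-pooling idea the abstract describes, below is a minimal PyTorch sketch. The module name, channel sizes, kernel choices, and the sigmoid gating are assumptions for illustration, not the authors' implementation; the pattern follows the general strip-pooling design of Hou et al. (CVPR 2020), which the paper builds on.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StripPooling(nn.Module):
    """Illustrative strip-pooling block (a sketch, not the paper's code).

    Averages features along full-height (H x 1) and full-width (1 x W)
    strips, expands each strip back to the input size, and fuses the two
    directions. Long, thin structures such as lane lines respond strongly
    to this anisotropic context, which square pooling windows dilute.
    """

    def __init__(self, channels: int):
        super().__init__()
        # 1-D refinements along each strip direction.
        self.conv_h = nn.Conv2d(channels, channels, (3, 1), padding=(1, 0), bias=False)
        self.conv_w = nn.Conv2d(channels, channels, (1, 3), padding=(0, 1), bias=False)
        self.fuse = nn.Conv2d(channels, channels, 1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, h, w = x.shape
        # Pool to a vertical strip (H x 1) and a horizontal strip (1 x W).
        strip_h = F.adaptive_avg_pool2d(x, (h, 1))          # N x C x H x 1
        strip_w = F.adaptive_avg_pool2d(x, (1, w))          # N x C x 1 x W
        # Refine each strip, then broadcast back to H x W.
        strip_h = self.conv_h(strip_h).expand(n, c, h, w)
        strip_w = self.conv_w(strip_w).expand(n, c, h, w)
        # Fuse the two directional contexts and gate the input features.
        gate = torch.sigmoid(self.fuse(F.relu(strip_h + strip_w)))
        return x * gate


# Usage: refine a feature map before the output branch.
feats = torch.randn(2, 64, 36, 100)    # e.g. a downsampled lane feature map
refined = StripPooling(64)(feats)
print(refined.shape)                   # torch.Size([2, 64, 36, 100])
```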
Database: Complementary Index
External link: