Traffic Sign Detection Based on the Improved YOLOv5

Author: Rongyun Zhang, Kunming Zheng, Peicheng Shi, Ye Mei, Haoran Li, Tian Qiu
Language: English
Publication Year: 2023
Subject:
Source: Applied Sciences, Vol 13, Iss 17, p 9748 (2023)
Document Type: article
ISSN: 2076-3417
DOI: 10.3390/app13179748
Description: With the advancement of intelligent driving technology, researchers are paying increasing attention to traffic sign recognition. Detection methods based on color or shape can recognize broad categories of signs, such as prohibition and warning signs, but they cover only a few categories and their accuracy is limited; such algorithms are computationally light and run in real time, yet their color features are strongly affected by lighting and weather. To address these problems, this paper proposes an improved YOLOv5 method. The method replaces the loss function of the YOLOv5 model with the SIoU loss function, which improves model training, and fuses the Convolutional Block Attention Module (CBAM) with the CSP1_3 module in YOLOv5 to form a new CSP1_3CBAM module, which strengthens YOLOv5's feature extraction ability and improves traffic sign detection accuracy. In addition, ACONC is introduced as the activation function of YOLOv5; by adaptively switching between linear and nonlinear activation through learnable switching factors, it improves YOLOv5's generalization ability (a code-level sketch of the CBAM and ACONC components follows this record). Results on the TT100K dataset show that the improved YOLOv5 raises precision from 73.2% to 81.9% (an increase of 8.7 percentage points), recall from 74.2% to 77.2% (3.0 points), and mAP from 75.7% to 81.9% (6.2 points). The frame rate also increases from 26.88 to 30.42 FPS. The same training on the GTSDB traffic sign dataset raises mAP from 90.2% to 92.5%, indicating that the algorithm generalizes well.
Database: Directory of Open Access Journals
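
To make the two architectural changes described above more concrete, below is a minimal PyTorch sketch (not the authors' code) of a CBAM attention block and an ACON-C-style activation corresponding to what the abstract calls ACONC. The channel-reduction ratio, spatial kernel size, and parameter initialization are illustrative assumptions, and fusing CBAM into YOLOv5's CSP1_3 block to form CSP1_3CBAM would additionally require wiring the module into that block's bottleneck path, which is not shown here.

import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """CBAM channel attention: pool over spatial dims, then re-weight channels."""
    def __init__(self, channels: int, reduction: int = 16):  # reduction ratio is an assumption
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )

    def forward(self, x):
        avg = self.mlp(torch.mean(x, dim=(2, 3), keepdim=True))   # average-pooled descriptor
        mx = self.mlp(torch.amax(x, dim=(2, 3), keepdim=True))    # max-pooled descriptor
        return x * torch.sigmoid(avg + mx)


class SpatialAttention(nn.Module):
    """CBAM spatial attention: pool over channels, then re-weight spatial positions."""
    def __init__(self, kernel_size: int = 7):  # kernel size is an assumption
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        avg = torch.mean(x, dim=1, keepdim=True)
        mx, _ = torch.max(x, dim=1, keepdim=True)
        return x * torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))


class CBAM(nn.Module):
    """Channel attention followed by spatial attention, shape-preserving."""
    def __init__(self, channels: int):
        super().__init__()
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x):
        return self.sa(self.ca(x))


class AconC(nn.Module):
    """ACON-C activation: (p1 - p2) * x * sigmoid(beta * (p1 - p2) * x) + p2 * x,
    with learnable per-channel p1, p2 and switching factor beta. Small beta pushes the
    unit toward a linear response; large beta toward a nonlinear, max-like response."""
    def __init__(self, channels: int):
        super().__init__()
        self.p1 = nn.Parameter(torch.randn(1, channels, 1, 1))
        self.p2 = nn.Parameter(torch.randn(1, channels, 1, 1))
        self.beta = nn.Parameter(torch.ones(1, channels, 1, 1))

    def forward(self, x):
        dpx = (self.p1 - self.p2) * x
        return dpx * torch.sigmoid(self.beta * dpx) + self.p2 * x

As a usage sketch, applying CBAM(256) to a feature map of shape (1, 256, 80, 80) returns a tensor of the same shape with channels and spatial positions re-weighted, so the module can be appended to the output of a CSP-style block without changing downstream tensor shapes; AconC can likewise replace an existing activation layer of matching channel count.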