A Multimodal Recurrent Model for Driver Distraction Detection

Authors: Marcel Ciesla, Gerald Ostermayer
Language: English
Publication year: 2024
Source: Applied Sciences, Vol 14, Iss 19, p 8935 (2024)
Document type: article
ISSN: 2076-3417
DOI: 10.3390/app14198935
Description: Distracted driving is a significant threat to road safety, causing numerous accidents every year. Driver distraction detection systems offer a promising solution by alerting the driver to refocus on the primary driving task. Even with increasing vehicle automation, human drivers must remain alert, especially in partially automated vehicles where they may need to take control in critical situations. In this work, an AI-based distraction detection model is developed that focuses on improving classification performance using a long short-term memory (LSTM) network. Unlike traditional approaches that evaluate individual frames independently, the LSTM network captures temporal dependencies across multiple time steps. In addition, this study investigated the integration of vehicle sensor data and an inertial measurement unit (IMU) to further improve detection accuracy. The results show that the recurrent LSTM network significantly improved the average F1 score from 71.3% to 87.0% compared to a traditional vision-based approach using a single image convolutional neural network (CNN). Incorporating sensor data further increased the score to 90.1%. These results highlight the benefits of integrating temporal dependencies and multimodal inputs and demonstrate the potential for more effective driver distraction detection systems that can improve road safety.
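The abstract describes fusing per-frame CNN image features with vehicle/IMU sensor readings and classifying the resulting sequence with an LSTM rather than frame by frame. The sketch below illustrates that general idea only; the dimensions, the early-fusion scheme (concatenating features per time step), and all names are illustrative assumptions, not the authors' actual architecture or configuration.

```python
# Hypothetical sketch: per-frame CNN features are concatenated with
# sensor readings (early fusion), the sequence is run through a minimal
# LSTM cell, and the final hidden state is classified with a softmax.
# All dimensions and weights here are illustrative, not from the paper.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Minimal LSTM cell with stacked input/forget/cell/output gates."""
    def __init__(self, input_dim, hidden_dim, rng):
        self.hidden_dim = hidden_dim
        # One stacked weight matrix covering all four gates.
        self.W = rng.standard_normal((4 * hidden_dim, input_dim + hidden_dim)) * 0.1
        self.b = np.zeros(4 * hidden_dim)

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c_new = f * c + i * np.tanh(g)   # update cell state
        h_new = o * np.tanh(c_new)      # new hidden state
        return h_new, c_new

def classify_sequence(cnn_feats, sensor_feats, cell, W_out, b_out):
    """Fuse image and sensor features per time step, run the LSTM,
    and return class probabilities from the last hidden state."""
    h = np.zeros(cell.hidden_dim)
    c = np.zeros(cell.hidden_dim)
    for img, sens in zip(cnn_feats, sensor_feats):
        x = np.concatenate([img, sens])  # early fusion per time step
        h, c = cell.step(x, h, c)
    logits = W_out @ h + b_out
    e = np.exp(logits - logits.max())    # numerically stable softmax
    return e / e.sum()

rng = np.random.default_rng(0)
T, IMG_DIM, SENS_DIM, HID, CLASSES = 16, 128, 6, 32, 10
cell = LSTMCell(IMG_DIM + SENS_DIM, HID, rng)
W_out = rng.standard_normal((CLASSES, HID)) * 0.1
b_out = np.zeros(CLASSES)
probs = classify_sequence(rng.standard_normal((T, IMG_DIM)),
                          rng.standard_normal((T, SENS_DIM)),
                          cell, W_out, b_out)
```

The design choice mirrored here is that temporal context (the whole clip) and multimodal input (image plus sensors) both enter the classifier, which the abstract reports lifted the F1 score from 71.3% (single-frame CNN) to 90.1% (LSTM with sensor fusion).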
Database: Directory of Open Access Journals