A versatile real-time vision-led runway localisation system for enhanced autonomy

Author: Kyriacos Tsapparellas, Nickolay Jelev, Jonathon Waters, Aditya M. Shrikhande, Sabine Brunswicker, Lyudmila S. Mihaylova
Language: English
Publication year: 2024
Subject:
Source: Frontiers in Robotics and AI, Vol 11 (2024)
Document type: article
ISSN: 2296-9144
DOI: 10.3389/frobt.2024.1490812
Description: This paper proposes a solution to the challenging task of autonomously landing Unmanned Aerial Vehicles (UAVs). An onboard computer vision module integrates the vision system with the ground control communication and video server connection. The vision platform performs feature extraction using Speeded Up Robust Features (SURF), followed by fast Structured Forests edge detection and then smoothing with a Kalman filter for accurate runway sideline prediction. A thorough evaluation is performed over real-world and simulation environments with respect to accuracy and processing time, in comparison with state-of-the-art edge detection approaches. The vision system is validated over videos with clear and difficult weather conditions, including fog, varying lighting conditions and crosswind landing. The experiments are performed using data from the X-Plane 11 flight simulator and real flight data from the Uncrewed Low-cost TRAnsport (ULTRA) self-flying cargo UAV. The vision-led system localises the runway sidelines with the Structured Forests approach with an accuracy of approximately 84.4%, outperforming the state-of-the-art approaches and delivering real-time performance. The main contribution of this work is the developed vision-led system for runway detection to aid autonomous landing of UAVs using electro-optical cameras. Although implemented on the ULTRA UAV, the vision-led system is applicable to any other UAV.
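The abstract's final pipeline stage, Kalman-filter smoothing of per-frame sideline estimates, can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes each detected sideline is parameterised as a hypothetical (slope, intercept) pair, uses identity (constant-state) dynamics, and picks arbitrary noise variances for illustration only.

```python
import numpy as np

class SidelineKalmanFilter:
    """Smooths noisy per-frame runway-sideline estimates (slope, intercept).

    Constant-state model: x_k = x_{k-1} + w_k,  z_k = x_k + v_k.
    The state is just [slope, intercept], so all matrices are 2x2.
    Noise variances are illustrative assumptions, not values from the paper.
    """

    def __init__(self, process_var=1e-4, measurement_var=1e-2):
        self.x = None                         # state estimate [slope, intercept]
        self.P = np.eye(2)                    # state covariance
        self.Q = process_var * np.eye(2)      # process-noise covariance
        self.R = measurement_var * np.eye(2)  # measurement-noise covariance

    def update(self, z):
        """Fuse one per-frame sideline measurement; return the smoothed state."""
        z = np.asarray(z, dtype=float)
        if self.x is None:                    # initialise on the first frame
            self.x = z.copy()
            return self.x
        # Predict step (identity dynamics): state carries over, uncertainty grows.
        self.P = self.P + self.Q
        # Correct step: blend prediction and measurement via the Kalman gain.
        K = self.P @ np.linalg.inv(self.P + self.R)
        self.x = self.x + K @ (z - self.x)
        self.P = (np.eye(2) - K) @ self.P
        return self.x

# Feeding noisy detections of a sideline near slope 1.0, intercept 0.0:
kf = SidelineKalmanFilter()
for z in [(1.0, 0.0), (1.2, 0.1), (0.8, -0.1), (1.0, 0.0)]:
    smoothed = kf.update(z)
```

After a few frames the filter's gain shrinks, so single-frame detection outliers (e.g. an edge misfired by glare or fog) perturb the smoothed sideline far less than the raw measurement.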
Database: Directory of Open Access Journals