Author: |
R. S. Zhuk, B. A. Zalesky, Ph. S. Trotski |
Language: |
Russian |
Year of publication: |
2020 |
Subject: |
|
Source: |
Informatika, Vol 17, Iss 2, Pp 17-24 (2020) |
Document type: |
article |
ISSN: |
1816-0301 |
DOI: |
10.37661/1816-0301-2020-17-2-17-24 |
Description: |
An autonomous visual navigation algorithm is considered, designed for the "home" return of an unmanned aerial vehicle (UAV) equipped with an on-board video camera and an on-board computer when GPS and GLONASS navigation signals are unavailable. The proposed algorithm is similar to well-known visual navigation algorithms such as V-SLAM (visual simultaneous localization and mapping) and visual odometry; however, it differs in that the mapping and localization processes are implemented separately. It calculates the geographical coordinates of the features on the frames taken by the on-board video camera during the flight from the start point up to the moment the GPS and GLONASS signals are lost. After the signal loss, a return mission is launched, which estimates the position of the UAV relative to the map built from the previously found features. The proposed approach does not require computations as complex as those of V-SLAM and, in contrast to visual odometry and traditional methods of inertial navigation, does not accumulate errors over time. The algorithm was implemented and tested using a DJI Phantom 3 Pro quadcopter. |
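The separation of mapping and localization described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a simplified 2D model in which each visual feature has an id and the camera yields the feature's offset from the UAV in world coordinates, so the mapping phase stores feature positions while GPS is available, and the localization phase inverts those offsets against the stored map after signal loss. The function names (`build_map`, `localize`) and the data layout are hypothetical.

```python
# --- Mapping phase: while GPS/GLONASS is available, record the geographic
# --- position of each visual feature seen by the on-board camera.

def build_map(gps_track):
    """gps_track: list of (uav_pos, [(feature_id, offset_from_uav), ...]).
    Returns {feature_id: world_position}."""
    feature_map = {}
    for (ux, uy), observations in gps_track:
        for fid, (dx, dy) in observations:
            # First sighting wins; a real system would average or triangulate
            # repeated sightings of the same feature.
            feature_map.setdefault(fid, (ux + dx, uy + dy))
    return feature_map

# --- Localization phase: after signal loss, estimate the UAV position from
# --- the features visible in the current frame and the previously built map.

def localize(feature_map, observations):
    """observations: [(feature_id, offset_from_uav), ...] for one frame.
    Returns the mean position estimate over matched features, or None."""
    estimates = []
    for fid, (dx, dy) in observations:
        if fid in feature_map:
            fx, fy = feature_map[fid]
            # The UAV is at the feature's mapped position minus the offset.
            estimates.append((fx - dx, fy - dy))
    if not estimates:
        return None
    n = len(estimates)
    return (sum(e[0] for e in estimates) / n,
            sum(e[1] for e in estimates) / n)

# Usage: map two features on the outbound leg, then localize on return.
track = [
    ((0.0, 0.0), [("a", (3.0, 1.0))]),
    ((5.0, 2.0), [("b", (-1.0, 4.0))]),
]
fmap = build_map(track)
pos = localize(fmap, [("a", (1.0, -1.0)), ("b", (2.0, 4.0))])
print(pos)  # → (2.0, 2.0)
```

Because each position fix is computed directly against the fixed map rather than chained from the previous estimate, errors do not accumulate over time, which is the contrast with visual odometry that the abstract emphasizes.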
Database: |
Directory of Open Access Journals |
External link: |
|