Coalmine image matching by fusing multi-level feature enhancement and weighted grid statistics
Author: Heping LI, Shaoqiu HONG, Jian CHENG, Ning AN, Hailong ZHAO, Haixin XIU
Language: Chinese
Year of publication: 2024
Source: Meitan Kexue Jishu, Vol. 52, Iss. 11, pp. 129-140 (2024)
Document type: article
ISSN: 0253-2336
DOI: 10.12438/cst.2024-1224
Description: Image feature extraction and matching is a critical technique for video and image stitching, visual localization, and navigation in underground coal mines. However, underground tunnelling environmental factors such as low light, uneven illumination, and repetitive textures produce low-contrast images with indistinct texture information, making feature point extraction difficult and increasing the mismatch rate. To address these issues, a coalmine image matching method based on multi-level feature fusion enhancement and weighted grid statistics is proposed. First, a multi-level feature detection network built on deformable convolution layers and a stacked network is adopted to ensure the rotational invariance of features. Second, the extracted feature points and descriptors are encoded and projected into a high-dimensional space by a feature enhancement module, and a Transformer is then used to improve the distinguishability of the features. Finally, to address the perceptual confusion caused by repetitive textures in underground scenes, a multi-stage matching optimization strategy based on weighted grids is applied; this strategy combines matching quality factors with motion smoothness constraints to filter out mismatches (illustrative sketches of the enhancement and filtering steps follow the record below). Extensive experiments were conducted on a real underground dataset as well as on two public datasets, LOL and HPatches. The results show that the proposed feature extraction and matching method achieves higher accuracy and robustness. Specifically, compared with the ORB, SIFT, ASLFeat, and SuperPoint algorithms, the proposed feature extraction method achieved average accuracy improvements of 33.07%, 69.78%, 17.65%, and 33.52%, respectively. Compared with the feature matching methods FLANN, BF+KNN, BF+RANSAC, and BF+GMS, the proposed feature matching method achieved average accuracy improvements of 19.66%, 23.26%, 4.16%, and 18.46%.
Database: Directory of Open Access Journals
External link:
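
The abstract's second step (projecting keypoints and descriptors into a high-dimensional space and refining them with a Transformer) can be pictured with the minimal sketch below. This is not the paper's architecture: the layer counts, dimensions, MLP positional encoder, and the `DescriptorEnhancer` name are illustrative assumptions.

```python
# Minimal sketch of Transformer-based descriptor enhancement (assumed design, not the
# paper's actual network): keypoint coordinates and raw descriptors are projected into
# a shared high-dimensional token space, then refined by self-attention.
import torch
import torch.nn as nn

class DescriptorEnhancer(nn.Module):
    def __init__(self, desc_dim=128, model_dim=256, n_heads=4, n_layers=4):
        super().__init__()
        self.desc_proj = nn.Linear(desc_dim, model_dim)      # descriptor -> high-dim token
        self.pos_proj = nn.Sequential(                        # (x, y) keypoint -> positional code
            nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, model_dim)
        )
        layer = nn.TransformerEncoderLayer(d_model=model_dim, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, kpts, descs):
        """kpts: (B, N, 2) normalized coordinates; descs: (B, N, desc_dim)."""
        tokens = self.desc_proj(descs) + self.pos_proj(kpts)
        enhanced = self.encoder(tokens)                        # self-attention over all keypoints
        return nn.functional.normalize(enhanced, dim=-1)       # unit-norm enhanced descriptors

# Usage: enhanced = DescriptorEnhancer()(kpts, descs); match with cosine similarity.
```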
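The final step (mismatch filtering by weighted grid statistics) follows the general idea of GMS-style grid voting with a quality weight per match. The sketch below is a simplified assumption of how such a filter could look: the grid size, the inverse-distance weight, the global adaptive threshold, and the omission of 3x3 neighborhood accumulation are all illustrative choices, not the paper's method.

```python
# Hedged sketch of weighted grid-based match filtering (GMS-like, simplified).
import numpy as np

def cell_index(pt, img_size, grid=(20, 20)):
    """Map an (x, y) keypoint to a flat grid-cell index; img_size = (width, height)."""
    gx = min(int(pt[0] / img_size[0] * grid[0]), grid[0] - 1)
    gy = min(int(pt[1] / img_size[1] * grid[1]), grid[1] - 1)
    return gy * grid[0] + gx

def filter_matches_weighted_grid(kps_a, kps_b, matches, size_a, size_b,
                                 grid=(20, 20), alpha=4.0):
    """Keep matches whose grid-cell pair gathers enough weighted support.

    kps_a, kps_b : (N, 2) arrays of keypoint coordinates
    matches      : list of (idx_a, idx_b, dist) with descriptor distance as quality cue
    """
    n_cells = grid[0] * grid[1]
    votes = np.zeros((n_cells, n_cells))       # weighted votes per cell pair
    cell_pairs = []
    for ia, ib, dist in matches:
        ca = cell_index(kps_a[ia], size_a, grid)
        cb = cell_index(kps_b[ib], size_b, grid)
        w = 1.0 / (1.0 + dist)                  # matching-quality weight (assumption)
        votes[ca, cb] += w
        cell_pairs.append((ca, cb))

    # Motion-smoothness cue: a cell pair is trusted only if its weighted support
    # stands out against the average support of non-empty cell pairs.
    nonzero = votes[votes > 0]
    thr = alpha * np.sqrt(nonzero.mean()) if nonzero.size else 0.0

    return [m for m, (ca, cb) in zip(matches, cell_pairs) if votes[ca, cb] >= thr]
```

The design intuition matching the abstract: correct matches between two views of the same scene cluster in consistent grid-cell pairs (motion smoothness), while mismatches scatter, so cell pairs with high weighted support are trusted and the rest are discarded.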