Efficient multi-modal high-precision semantic segmentation from MLS point cloud without 3D annotation

Authors: Yuan Wang, Pei Sun, Wenbo Chu, Yuhao Li, Yiping Chen, Hui Lin, Zhen Dong, Bisheng Yang, Chao He
Language: English
Year of publication: 2024
Subject:
Source: International Journal of Applied Earth Observation and Geoinformation, Vol. 135, article 104243 (2024)
Document type: article
ISSN: 1569-8432
DOI: 10.1016/j.jag.2024.104243
Description: Fast, high-precision semantic segmentation of Mobile Laser Scanning (MLS) point clouds faces major challenges: large data volumes, occlusion in complex scenes, and the high cost of annotating 3D point clouds. To tackle these challenges, this paper proposes an efficient, high-precision semantic segmentation method for MLS point clouds, Mapping Considering Semantic Segmentation (MCSS), which leverages the 2D-3D mapping relationship. The method not only eliminates the need for labeled 3D samples but also compensates for missing information by using multimodal data. Based on the semantic segmentation results produced by a neural network on panoramic images, a multi-frame mapping strategy and a local spatial similarity optimization method are proposed to project the panoramic image semantic predictions onto the point clouds, thereby establishing coarse semantic information in the 3D domain. A hierarchical geometric constraint model (HGCM) is then designed to refine this coarse labeling into high-precision point cloud semantic segmentation. Comprehensive experimental evaluations demonstrate the effectiveness and efficiency of the method on two challenging large-scale MLS datasets, where it achieves improvements of 16.8 % and 16.3 % over SPT. Furthermore, the proposed method processes 1 million points in 8 s on average and requires neither annotation nor training, surpassing previous methods in efficiency.
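The core 2D-to-3D transfer step described in the abstract — assigning each 3D point the label of the panoramic pixel it projects to — can be illustrated with a minimal single-frame sketch. This is not the paper's multi-frame mapping strategy or its HGCM refinement; it only shows the basic equirectangular projection of points (assumed here to be expressed in the panoramic camera frame, with x right, y down, z forward) onto a 2D label map:

```python
import numpy as np

def project_labels_to_points(points_cam, label_img):
    """Assign each 3D point the semantic label of the equirectangular
    pixel it projects to (single-frame illustrative sketch).

    points_cam : (N, 3) array in the panoramic camera frame
                 (assumed convention: x right, y down, z forward)
    label_img  : (H, W) integer label map from 2D semantic segmentation
    """
    H, W = label_img.shape
    x, y, z = points_cam[:, 0], points_cam[:, 1], points_cam[:, 2]
    r = np.linalg.norm(points_cam, axis=1)
    # Spherical angles: azimuth in [-pi, pi], elevation in [-pi/2, pi/2]
    azimuth = np.arctan2(x, z)
    elevation = np.arcsin(np.clip(y / np.maximum(r, 1e-9), -1.0, 1.0))
    # Map angles to equirectangular pixel coordinates
    u = ((azimuth / (2 * np.pi) + 0.5) * W).astype(int) % W
    v = np.clip(((elevation / np.pi + 0.5) * H).astype(int), 0, H - 1)
    return label_img[v, u]

# A point straight ahead of the camera picks up the label at the
# centre of the panorama.
labels = project_labels_to_points(
    np.array([[0.0, 0.0, 1.0]]), np.arange(32).reshape(4, 8)
)
```

In the full method, per-frame label transfers like this would be fused across multiple panoramic frames and then refined by the local spatial similarity optimization and the HGCM.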
Database: Directory of Open Access Journals