An Inverse Node Graph-Based Method for the Urban Scene Segmentation of 3D Point Clouds
Author: Bufan Zhao, Xianghong Hua, Kegen Yu, Xiaoxing He, Weixing Xue, Qiqi Li, Hanwen Qi, Lujie Zou, Cheng Li
Language: English
Year of publication: 2021
Subject:
Source: Remote Sensing, Vol 13, Iss 15, p 3021 (2021)
Document type: article
ISSN: 2072-4292
DOI: 10.3390/rs13153021
Description: Urban object segmentation and classification are critical data-processing steps in scene understanding, intelligent vehicles, and 3D high-precision maps, and semantic segmentation of 3D point clouds is the foundational step in object recognition. To identify intersecting objects and improve classification accuracy, this paper proposes a segment-based classification method for 3D point clouds. The method first divides the points into multi-scale supervoxels and groups them through the proposed inverse node graph (IN-Graph) construction, which needs no prior information about the nodes; instead, supervoxels are grouped by judging the connection state of the edges between them. The method reaches a minimum global energy by graph cutting, obtaining structural segments as completely as possible while retaining object boundaries. A random forest classifier is then used for supervised classification. To deal with the mislabeling of scattered fragments, a higher-order CRF with small-label-cluster optimization is proposed to refine the classification results. Experiments were carried out on a mobile laser scanning (MLS) point cloud dataset and a terrestrial laser scanning (TLS) point cloud dataset, yielding overall accuracies of 97.57% and 96.39%, respectively. Object boundaries were retained well, and the method achieved good results in classifying cars and motorcycles. Further experimental analyses verified the advantages of the proposed method and demonstrated its practicality and versatility. (An illustrative sketch of the segment-level classification step appears after this record.)
Database: Directory of Open Access Journals
External link:
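The description above outlines a segment-then-classify pipeline: multi-scale supervoxels are grouped into structural segments via IN-Graph construction and graph cutting, the segments are labeled with a random forest classifier, and a higher-order CRF refines the labels. The following is only a minimal sketch of the supervised classification and label-propagation step, assuming hypothetical per-segment feature vectors and a point-to-segment index array; the supervoxel generation, IN-Graph construction, graph-cut energy minimization, and CRF refinement described in the paper are not reproduced here.

```python
# Hypothetical sketch: supervised classification of pre-computed point-cloud
# segments with a random forest, loosely following the pipeline described in
# the abstract. The segmentation itself (supervoxels, IN-Graph, graph cut)
# and the higher-order CRF refinement are NOT shown.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Placeholder data: one feature vector per segment (e.g. geometric descriptors
# such as height range, planarity, point density). Real features would be
# computed from the segmented point cloud; these are random stand-ins.
n_train_segments, n_test_segments, n_features = 200, 50, 8
X_train = rng.normal(size=(n_train_segments, n_features))
y_train = rng.integers(0, 4, size=n_train_segments)  # 4 example classes, e.g. ground/building/car/motorcycle
X_test = rng.normal(size=(n_test_segments, n_features))

# Supervised classification step (the paper uses a random forest classifier).
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
segment_labels = clf.predict(X_test)

# Propagate each segment's predicted label to its member points.
# `segment_of_point` is a hypothetical index array mapping each point
# of the test scene to the id of the segment that contains it.
n_points = 1000
segment_of_point = rng.integers(0, n_test_segments, size=n_points)
point_labels = segment_labels[segment_of_point]
print(point_labels[:10])
```

Classifying segments rather than individual points keeps the label assignment consistent within each structural segment; a label-smoothing step such as the paper's higher-order CRF would then be applied on top of these per-point labels.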