Author:
Xiang Gao, Ronghao Yang, Xuewen Chen, Junxiang Tan, Yan Liu, Zhaohua Wang, Jiahao Tan, Huan Liu
Language:
English
Year of publication:
2024
Subject:

Source:
Remote Sensing, Vol 16, Iss 18, p 3462 (2024)
Document type:
article
ISSN:
2072-4292
DOI:
10.3390/rs16183462
Description:
Three-dimensional indoor models have wide applications in fields such as indoor navigation, civil engineering, and virtual reality. With the development of LiDAR technology, the automatic reconstruction of indoor models from point clouds has gained significant attention. We propose a new framework for generating indoor 3D digital models from point clouds. The proposed method first generates a room instance map of the indoor scene. Walls are detected and projected onto a horizontal plane to form line segments. These segments are extended and intersected, and a subset is selected by solving an integer programming problem to create room polygons. The polygons are converted into a raster image, and image connectivity detection is used to generate the room instance map. The roof points are then extracted, and an overlap analysis against the room instance map segments the full roof point cloud, yielding the roof of each room. Room boundaries are defined by extracting and regularizing the boundaries of each room's roof points. Finally, doors and windows are detected in two steps, and the floor plans and 3D models are generated separately. Experiments on the Giblayout dataset show that our method is robust to clutter and furniture points, producing high-accuracy models that match the real scenes. The mean precision and recall of the floor plans are both 0.93, and the Point–Surface Distance (PSD) and the standard deviation of the PSD for the 3D models are 0.044 m and 0.066 m, respectively.
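To illustrate one step summarized in the abstract, the following is a minimal, hypothetical Python sketch (not the authors' implementation) of turning candidate room polygons into a room instance map by rasterizing them onto a grid and labeling connected regions. The polygon inputs, grid size and resolution, and the use of matplotlib and SciPy are assumptions made for the example.

```python
# Hypothetical sketch of the "raster image + connectivity detection" step from
# the abstract; polygon inputs, grid size/resolution, and the chosen libraries
# are illustrative assumptions, not the authors' code.
import numpy as np
from matplotlib.path import Path
from scipy import ndimage

def room_instance_map(polygons, resolution=0.05, size=(200, 200)):
    """Rasterize 2D room polygons (in metres) onto a grid and label connected rooms."""
    occupied = np.zeros(size, dtype=bool)
    ys, xs = np.mgrid[0:size[0], 0:size[1]]
    centers = np.column_stack([xs.ravel() * resolution, ys.ravel() * resolution])
    for poly in polygons:                       # each poly: (N, 2) array of vertices
        inside = Path(poly).contains_points(centers)
        occupied |= inside.reshape(size)
    labels, n_rooms = ndimage.label(occupied)   # 4-connected components = room instances
    return labels, n_rooms

# Example: two rectangular rooms separated by a 0.5 m wall gap -> 2 instances
rooms = [np.array([[0.5, 0.5], [4.0, 0.5], [4.0, 4.0], [0.5, 4.0]]),
         np.array([[4.5, 0.5], [8.0, 0.5], [8.0, 4.0], [4.5, 4.0]])]
labels, n_rooms = room_instance_map(rooms)
print(n_rooms)  # expected: 2
```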
Database:
Directory of Open Access Journals
External link:
