Author:
Takayuki Shinohara, Haoyi Xiu, Masashi Matsuoka
Language:
English
Year of publication:
2021
Source:
IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, Vol 14, Pp 11630-11642 (2021)
Document type:
article
ISSN:
2151-1535
DOI:
10.1109/JSTARS.2021.3124610
Description:
Since 2017, many deep learning methods for 3-D point clouds observed by airborne LiDAR (airborne 3-D point clouds) have been proposed. Beyond point-cloud-only models, deep learning methods have also been proposed for points together with their waveforms observed by full-waveform LiDAR (airborne FW data). Airborne FW data enable highly accurate land cover classification, but open datasets often provide only airborne 3-D point clouds. Therefore, restoring waveforms from airborne 3-D point clouds is important for improving land cover classification on point clouds published as open data. In this article, we propose a deep learning model that translates an airborne 3-D point cloud into airborne FW data (a point-to-waveform translation model, point2wave) using a conditional generative adversarial net (cGAN). Our point2wave is a cGAN pipeline consisting of a generator that translates each point of the input airborne 3-D point cloud into its corresponding waveform, and discriminators that measure the distance between the translated waveform and the ground-truth waveform. Using a dataset of paired point clouds and waveforms, we experimented with translating points into waveforms with point2wave. Experimental results showed that point2wave could translate waveforms from the airborne 3-D point cloud, and the translated fake waveforms achieved nearly the same land cover classification performance as the real waveforms.
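To make the abstract's pipeline concrete, the following is a minimal sketch of the conditional-GAN structure it describes: a generator conditioned on each 3-D point produces a fixed-length waveform, and a discriminator scores (point, waveform) pairs. The layer sizes, the one-hidden-layer MLPs, the 8-dim noise vector, and the waveform length of 160 samples are all illustrative assumptions, not the architecture of point2wave itself.

```python
import numpy as np

rng = np.random.default_rng(0)
WAVE_LEN = 160  # assumed number of waveform samples per return (illustrative)

def mlp(in_dim, hidden, out_dim):
    """Random weights for a one-hidden-layer MLP (illustrative stand-in)."""
    return {
        "W1": rng.normal(0, 0.1, (in_dim, hidden)),
        "b1": np.zeros(hidden),
        "W2": rng.normal(0, 0.1, (hidden, out_dim)),
        "b2": np.zeros(out_dim),
    }

def forward(p, x):
    h = np.tanh(x @ p["W1"] + p["b1"])
    return h @ p["W2"] + p["b2"]

# Generator G(point, z) -> waveform, conditioned on the point's xyz coordinates.
G = mlp(3 + 8, 64, WAVE_LEN)          # input: xyz + 8-dim latent noise
# Discriminator D(point, waveform) -> real/fake logit for the conditioned pair.
D = mlp(3 + WAVE_LEN, 64, 1)

points = rng.normal(size=(5, 3))       # toy batch of 5 airborne points
z = rng.normal(size=(5, 8))            # per-point latent noise
fake_waves = forward(G, np.concatenate([points, z], axis=1))
logits = forward(D, np.concatenate([points, fake_waves], axis=1))

print(fake_waves.shape)  # (5, 160)
print(logits.shape)      # (5, 1)
```

In training, the discriminator's score on (point, fake waveform) versus (point, real waveform) pairs would drive the adversarial loss, which is what lets the generator learn per-point waveform shapes rather than an unconditional average.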
Database:
Directory of Open Access Journals