Author:
Farzam Tajdari, Toon Huysmans, Xinhe Yao, Jun Xu, Maryam Zebarjadi, Yu Song
Language:
English
Year of publication:
2024
Source:
IEEE Open Journal of the Computer Society, Vol 5, Pp 343-355 (2024)
Document type:
article
ISSN:
2644-1268
DOI:
10.1109/OJCS.2024.3406645
Description:
4D scans of dynamic, deformable human body parts help researchers better understand spatiotemporal features. However, reconstructing 4D scans from multiple asynchronous cameras poses two main challenges: 1) finding dynamic correspondences among the frames captured by each camera at that camera's own timestamps (dynamic feature recognition), and 2) reconstructing 3D shapes from the combined point clouds captured by different cameras at asynchronous timestamps (multi-view fusion). Here, we introduce a generic framework that 1) finds and aligns dynamic features in the 3D scans captured by each camera using the non-rigid iterative-closest-farthest-points algorithm; 2) synchronizes scans captured by asynchronous cameras through a novel ADGC-LSTM-based network that aligns the 3D scans of different cameras to the timeline of a specific camera; and 3) registers a high-quality template to the synchronized scans at each timestamp, using a non-rigid registration method, to form a high-quality 3D mesh model. With a newly developed 4D foot scanner, we validate the framework and create the first open-access dataset, named 4D-feet. It contains 4D shapes (15 fps) of the right and left feet of 58 participants (116 feet, comprising 5147 3D frames), covering the significant phases of the gait cycle. The results demonstrate the effectiveness of the proposed framework, particularly in synchronizing asynchronous 4D scans.
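To make the three-stage structure described in the abstract concrete, below is a minimal, illustrative Python sketch of the pipeline. All function names are hypothetical, and the simple stand-ins (identity alignment, linear interpolation in place of the ADGC-LSTM synchronization, nearest-point snapping in place of non-rigid registration) are assumptions for illustration only, not the authors' implementation.

```python
"""Structural sketch of the three-stage 4D-reconstruction pipeline.
Names and the placeholder stages are illustrative assumptions only."""
import numpy as np

def align_dynamic_features(frames):
    """Stage 1 (placeholder for the non-rigid ICFP alignment): align
    dynamic features across one camera's frames. Here: identity."""
    return frames

def synchronize_to_reference(cam_frames, cam_times, ref_times):
    """Stage 2 (placeholder for the ADGC-LSTM network): resample one
    camera's point-cloud sequence onto the reference camera's timeline.
    Linear interpolation between neighbouring frames stands in for the
    learned synchronization."""
    synced = []
    for t in ref_times:
        i = np.clip(np.searchsorted(cam_times, t), 1, len(cam_times) - 1)
        t0, t1 = cam_times[i - 1], cam_times[i]
        w = (t - t0) / (t1 - t0) if t1 > t0 else 0.0
        synced.append((1 - w) * cam_frames[i - 1] + w * cam_frames[i])
    return synced

def register_template(template, fused_cloud):
    """Stage 3 (placeholder for non-rigid registration): snap each
    template vertex to its nearest point in the fused cloud."""
    d = np.linalg.norm(template[:, None, :] - fused_cloud[None, :, :], axis=2)
    return fused_cloud[d.argmin(axis=1)]

# Toy demo: two asynchronous "cameras", each frame a small point cloud.
rng = np.random.default_rng(0)
ref_times = np.linspace(0.0, 1.0, 15)        # 15 fps reference timeline
cams = []
for offset in (0.00, 0.03):                  # asynchronous timestamps
    times = ref_times + offset
    frames = align_dynamic_features(rng.normal(size=(len(times), 100, 3)))
    cams.append(synchronize_to_reference(frames, times, ref_times))
fused = np.concatenate([c[0] for c in cams], axis=0)   # fuse frame 0
mesh_vertices = register_template(rng.normal(size=(50, 3)), fused)
print(mesh_vertices.shape)  # (50, 3)
```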
Database:
Directory of Open Access Journals