Author:
Salehi, Batool; Roy, Debashri; Gu, Jerry; Dick, Chris; Chowdhury, Kaushik
Source:
IEEE Transactions on Mobile Computing, December 2024, Vol. 23, Issue 12, pp. 11655-11669 (15 pages)
Abstract:
Fast sector steering in the mmWave band under vehicular mobility is challenging because the standard-defined exhaustive search over predefined antenna sectors cannot reliably complete within short contact times. This paper proposes machine learning to accelerate sector selection in mmWave radios with large codebooks, using data from multiple non-RF sensors such as LiDAR, GPS, and camera images. The contributions of this paper are threefold. First, we propose a multimodal deep learning architecture that fuses the inputs from these data sources and locally predicts the best-aligned sectors at a vehicle. Second, we propose FLASH-and-Prune, which combines knowledge from multiple vehicles by aggregating local model parameters and exploits model pruning to reduce the model-parameter exchange overhead. Third, we present a pruning strategy that accounts for the distributed nature of federated learning to adaptively prune or retrieve model weights. We validate the proposed architecture on a real-world multimodal dataset collected from an autonomous car. FLASH-and-Prune incurs 29.25% and 35.89% less overhead in the uplink and downlink, respectively, than standard federated learning.
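The core FLASH-and-Prune idea, per the abstract, is federated aggregation of vehicle-local model parameters combined with weight pruning to shrink the uplink/downlink exchange. A minimal sketch of that pattern is below; the function names, magnitude-based pruning criterion, and sparsity value are illustrative assumptions, not the paper's actual adaptive prune-or-retrieve strategy.

```python
import numpy as np

def prune_by_magnitude(weights, sparsity=0.3):
    """Zero out the smallest-magnitude weights (assumed criterion).
    Only surviving non-zero entries would need transmitting,
    which is where the exchange-overhead savings come from."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

def federated_average(local_models):
    """Server-side aggregation: element-wise mean of client weights,
    as in standard federated averaging."""
    return np.mean(np.stack(local_models), axis=0)

# Each vehicle trains locally, prunes, and uploads sparse weights;
# the server averages them into a global model.
rng = np.random.default_rng(0)
clients = [rng.normal(size=(4, 4)) for _ in range(3)]
uploads = [prune_by_magnitude(w, sparsity=0.3) for w in clients]
global_model = federated_average(uploads)
```

In the paper's setting the pruning decision is additionally adapted to the distributed federated rounds (weights can be retrieved again), which this one-shot sketch does not model.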
Database:
Supplemental Index
External link: