Communication-Efficient Federated Learning Over-the-Air With Sparse One-Bit Quantization

Authors: Oh, Junsuk; Lee, Donghyun; Won, Dongwook; Noh, Wonjong; Cho, Sungrae
Source: IEEE Transactions on Wireless Communications, October 2024, Vol. 23, Issue 10, pp. 15673-15689
Abstract: Federated learning (FL) is a framework for realizing distributed machine learning in an environment where training samples are distributed across devices. Recently, FL has employed over-the-air computation, which enables all devices to transmit their model updates simultaneously. This work proposes a communication-efficient sparse one-bit analog aggregation (SOBAA) method that incorporates a new power control scheme, layer-wise scaled one-bit quantization, layer-wise sparsification, and an error-feedback mechanism. We derive a tight upper bound on the expected convergence rate of the proposed SOBAA as a closed-form expression. From this expression, we explicitly identify the relationship between the convergence rate and the compression and aggregation errors. Based on the theoretical convergence analysis, we formulate a joint optimization problem over the compression ratio and power control to minimize the compression and aggregation errors, leading to the fastest convergence. In each communication round, the optimization problem is decomposed and solved in a computationally efficient and feasible way. From this solution, we characterize the trade-off between learning performance and communication cost. Through extensive experiments on the well-known MNIST and CIFAR-10 datasets, we confirm that the proposed method provides an improved trade-off between test accuracy and communication cost and a faster convergence rate than other state-of-the-art methods. In addition, the proposed method is shown to be more effective for more complex datasets and learning models.
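The abstract does not specify the exact update rule, so the following Python sketch only illustrates the kind of per-layer compression the method combines: top-k sparsification, layer-wise scaled one-bit (sign) quantization, and error feedback of the residual into the next round. The function and parameter names (sobaa_compress, keep_ratio) are hypothetical and not taken from the paper.

```python
import numpy as np

def sobaa_compress(grad_layers, errors, keep_ratio=0.1):
    """Illustrative layer-wise sparse one-bit compression with error feedback.

    grad_layers : list of np.ndarray, one gradient tensor per layer
    errors      : list of np.ndarray, accumulated compression error per layer
    keep_ratio  : fraction of entries kept per layer (hypothetical parameter)
    """
    compressed, new_errors = [], []
    for g, e in zip(grad_layers, errors):
        corrected = g + e                               # error feedback: add residual from last round
        flat = corrected.ravel()
        k = max(1, int(keep_ratio * flat.size))
        idx = np.argpartition(np.abs(flat), -k)[-k:]    # layer-wise top-k sparsification
        mask = np.zeros(flat.size, dtype=bool)
        mask[idx] = True
        scale = np.mean(np.abs(flat[mask]))             # layer-wise scaling factor
        q = np.zeros_like(flat)
        q[mask] = scale * np.sign(flat[mask])           # one-bit (sign) quantization of kept entries
        q = q.reshape(g.shape)
        compressed.append(q)
        new_errors.append(corrected - q)                # residual carried to the next round
    return compressed, new_errors
```

In an over-the-air aggregation setting, each device would transmit its compressed update (scaled by the power-control policy described in the paper) on a shared channel, and the receiver would obtain the superposed sum of all updates; the power-control and channel details are beyond what this sketch covers.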
Database: Supplemental Index