Temporal Predictive Coding for Gradient Compression in Distributed Learning

Authors: Edin, Adrian; Chen, Zheng; Kieffer, Michel; Johansson, Mikael
Year: 2024
Subject:
Document Type: Working Paper
Description: This paper proposes a prediction-based gradient compression method for distributed learning with event-triggered communication. Our goal is to reduce the amount of information transmitted from the distributed agents to the parameter server by exploiting temporal correlation in the local gradients. We use a linear predictor that \textit{combines past gradients to form a prediction of the current gradient}, with coefficients that are optimized by solving a least-squares problem. In each iteration, every agent transmits the predictor coefficients to the server so that the predicted local gradient can be computed. The difference between the true local gradient and the predicted one, termed the \textit{prediction residual}, is transmitted only when its norm exceeds a threshold. When this additional communication step is omitted, the server uses the prediction as the estimated gradient. The proposed design shows notable performance gains over existing methods in the literature, achieving convergence with reduced communication costs.
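The scheme described above can be sketched in a few lines of NumPy. This is a minimal illustration under assumed details, not the authors' implementation: the predictor memory length, the least-squares solver, and the function names (`predictor_coeffs`, `event_triggered_step`) are all hypothetical choices for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

def predictor_coeffs(past_grads, current_grad):
    """Least-squares coefficients a minimizing ||G a - g||_2,
    where G stacks the past local gradients as columns."""
    G = np.stack(past_grads, axis=1)                    # shape (d, m)
    a, *_ = np.linalg.lstsq(G, current_grad, rcond=None)
    return a

def event_triggered_step(past_grads, g, threshold):
    """One agent-side iteration (hypothetical sketch): fit the linear
    predictor, then send the residual only if its norm is large."""
    a = predictor_coeffs(past_grads, g)
    g_hat = np.stack(past_grads, axis=1) @ a            # server can recompute this from a
    r = g - g_hat                                       # prediction residual
    send_residual = np.linalg.norm(r) > threshold
    # coefficients are always sent; the residual only on a triggering event
    return a, (r if send_residual else None)

# Toy usage: a gradient strongly correlated with its recent history,
# so the prediction is accurate and the residual transmission is skipped.
past = [rng.normal(size=8) for _ in range(3)]
g = 0.5 * past[-1] + 0.3 * past[-2] + 0.01 * rng.normal(size=8)
a, r = event_triggered_step(past, g, threshold=0.1)
```

With strong temporal correlation the residual stays below the threshold and only the `m` predictor coefficients (here 3 scalars instead of a length-8 gradient) cross the channel, which is the source of the communication savings.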
Comment: 8 pages, 3 figures, presented at the 60th Allerton Conference on Communication, Control, and Computing
Database: arXiv