Journal article
M22: A Communication-Efficient Algorithm for Federated Learning Inspired by Rate-Distortion
Abstract
In federated learning (FL), communication between the remote clients and the Parameter Server (PS) is a crucial bottleneck. Model updates must therefore be compressed so as to minimize the loss in accuracy that results from this communication constraint. This paper proposes the “M-magnitude weighted L2 distortion + 2 degrees of freedom” (M22) algorithm, a rate-distortion-inspired approach to gradient compression for federated …
Authors
Liu, Y.; Rini, S.; Salehkalaibar, S.; Chen, J.
Journal
IEEE Transactions on Communications, Vol. 72, No. 2, pp. 845–860
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Publication Date
February 1, 2024
DOI
10.1109/tcomm.2023.3327778
ISSN
0090-6778