File(s) under permanent embargo
Q-GADMM: Quantized group ADMM for communication efficient decentralized machine learning
conference contribution
posted on 2020-05-14, 00:00, authored by A Elgabli, Jihong Park, A S Bedi, M Bennis, V Aggarwal

In this paper, we propose a communication-efficient decentralized machine learning (ML) algorithm, coined quantized group ADMM (Q-GADMM). Every worker in Q-GADMM communicates only with two neighbors, and updates its model via the group alternating direction method of multipliers (GADMM), thereby ensuring fast convergence while reducing the number of communication rounds. Furthermore, each worker quantizes its model updates before transmission, thereby decreasing the communication payload sizes. We prove that Q-GADMM converges to the optimal solution for convex loss functions, and numerically show that Q-GADMM yields 7x lower communication cost while achieving almost the same accuracy and convergence speed as GADMM without quantization.
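The abstract's key mechanism is that each worker quantizes its model update before transmitting it to its neighbors. A minimal sketch of one such unbiased stochastic uniform quantizer is shown below; this is an illustrative assumption for exposition, not necessarily the exact quantizer used in the paper (the `bits` parameter and range-based scaling are choices made here for the sketch):

```python
import numpy as np

def quantize(delta, bits=2):
    """Stochastically quantize a model-update vector to 2**bits uniform levels.

    Rounding up happens with probability equal to the fractional part,
    which makes the quantizer unbiased: E[quantize(delta)] == delta.
    Illustrative sketch only; the paper's scheme may differ.
    """
    levels = 2 ** bits - 1
    lo, hi = delta.min(), delta.max()
    if hi == lo:                      # constant vector: nothing to quantize
        return delta.copy()
    step = (hi - lo) / levels         # width of one quantization bin
    scaled = (delta - lo) / step      # position in units of bins
    floor = np.floor(scaled)
    frac = scaled - floor
    # stochastic rounding: go up with probability `frac`
    q = floor + (np.random.rand(*delta.shape) < frac)
    return lo + q * step

# Example: quantize a random 1000-dimensional update to 2 bits per entry
np.random.seed(0)
d = np.random.randn(1000)
dq = quantize(d, bits=2)
```

With `bits=2` each entry needs only 2 bits plus the shared `(lo, step)` pair, instead of a full float, which is the source of the payload-size reduction the abstract describes; the per-entry error is bounded by one bin width.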
History
Event
Event: ICASSP - IEEE International Conference on Acoustics, Speech and Signal Processing (2020 : Barcelona, Spain)
Volume: 2020-May
Pagination: 8876-8880
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Location: Online : Barcelona, Spain
Place of publication: Piscataway, N.J.
Publisher DOI
Start date: 2020-05-04
End date: 2020-05-08
ISSN: 1520-6149
ISBN-13: 9781509066315
Language: eng
Publication classification: E1.1 Full written paper - refereed
Copyright notice: 2020, IEEE
Title of proceedings: ICASSP 2020 : Proceedings of IEEE International Conference on Acoustics, Speech and Signal Processing