
# DEED: A general quantization scheme for saving bits in communication

Submit Time: 2020-06-16
Authors: Tian Ye; Peijun Xiao; Ruoyu Sun
Institutes: 1. Tsinghua University; 2. University of Illinois Urbana-Champaign

## Abstract

Quantization is a popular technique to reduce communication in distributed optimization. Motivated by the classical work on inexact gradient descent (GD) (Bertsekas & Tsitsiklis, 2000), we provide a general convergence analysis framework for inexact GD that is tailored to quantization schemes. We also propose a quantization scheme, Double Encoding and Error Diminishing (DEED). DEED achieves small communication complexity in three settings: frequent-communication large-memory, frequent-communication small-memory, and infrequent-communication (e.g. federated learning). More specifically, in the frequent-communication large-memory setting, DEED can be easily combined with Nesterov's method, so that the total number of bits required is $\tilde{O}( \sqrt{\kappa} \log 1/\epsilon )$, where $\tilde{O}$ hides numerical constants and $\log \kappa$ factors. In the frequent-communication small-memory setting, DEED combined with SGD requires only $\tilde{O}( \kappa \log 1/\epsilon)$ bits in the interpolation regime. In the infrequent-communication setting, DEED combined with Federated Averaging requires a smaller total number of bits than Federated Averaging alone. All of these algorithms converge at the same rate as their non-quantized counterparts while using fewer bits.
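To make the communication-saving idea concrete, here is a minimal sketch of a generic unbiased stochastic quantizer (in the style of QSGD). This is only an illustration of gradient quantization as a technique; it is not the DEED scheme itself, whose details the abstract does not give. Each coordinate is mapped to one of `levels` grid points times its sign, so only about $\log_2(\text{levels}) + 1$ bits per coordinate plus one norm need to be transmitted.

```python
import numpy as np

def stochastic_quantize(v, levels=4, seed=None):
    """Unbiased stochastic quantizer (generic illustration, not DEED).

    Each entry of `v` is snapped to one of `levels` grid points in [0, 1]
    (times its sign), scaled by the vector norm. Rounding up with
    probability equal to the fractional part makes the estimate unbiased.
    """
    rng = np.random.default_rng(seed)
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return np.zeros_like(v), norm
    scaled = np.abs(v) / norm * (levels - 1)   # each entry in [0, levels-1]
    lower = np.floor(scaled)
    # round up with probability equal to the fractional part => E[q] = scaled
    q = lower + (rng.random(v.shape) < (scaled - lower))
    return np.sign(v) * q / (levels - 1), norm

def dequantize(q, norm):
    # receiver reconstructs an unbiased estimate of the original vector
    return q * norm

g = np.array([3.0, -4.0, 0.0])        # toy "gradient"
q, n = stochastic_quantize(g, levels=4, seed=0)
g_hat = dequantize(q, n)              # unbiased estimate of g
```

The key property is unbiasedness: the quantization error behaves like the gradient noise covered by the inexact-GD analysis, which is why such schemes can preserve the convergence rate of the underlying method.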
From: Tian Ye (叶添)
DOI: 10.12074/202005.00043
Recommended citation: Tian Ye, Peijun Xiao, Ruoyu Sun. (2020). DEED: A general quantization scheme for saving bits in communication. [ChinaXiv:202005.00043]