Hybrid All-Reduce Strategy with Layer Overlapping for Reducing Communication Overhead in Distributed Deep Learning
KIPS Transactions on Computer and Communication Systems, Vol. 10, No. 7, pp. 191-198, Jul. 2021
DOI: 10.3745/KTCCS.2021.10.7.191
Keywords: Distributed Deep Learning, Synchronization, Layer Overlapping, Allreduce
Cite this article
[IEEE Style]
D. Kim, S. Yeo, and S. Oh, "Hybrid All-Reduce Strategy with Layer Overlapping for Reducing Communication Overhead in Distributed Deep Learning," KIPS Transactions on Computer and Communication Systems, vol. 10, no. 7, pp. 191-198, 2021. DOI: 10.3745/KTCCS.2021.10.7.191.
[ACM Style]
Daehyun Kim, Sangho Yeo, and Sangyoon Oh. 2021. Hybrid All-Reduce Strategy with Layer Overlapping for Reducing Communication Overhead in Distributed Deep Learning. KIPS Transactions on Computer and Communication Systems, 10, 7 (2021), 191-198. DOI: 10.3745/KTCCS.2021.10.7.191.