xCCL: A Survey of Industry-Led Collective Communication Libraries for Deep Learning

Journal of Computer Science and Technology, 2023  (Invited Paper for the Special Issue in Honor of Professor Kai Hwang’s 80th Birthday)

Adam Weingram, Yuke Li, Hao Qi, Darren Ng, Liuyao Dai, Xiaoyi Lu

Abstract

Machine learning techniques have become ubiquitous in both industry and academia. Increasing model sizes and training data volumes necessitate fast and efficient distributed training approaches. Collective communications greatly simplify inter- and intra-node data transfer and are an essential part of the distributed training process, as information such as gradients must be shared between processing nodes. In this paper, we survey the current state-of-the-art collective communication libraries (namely xCCL, including NCCL, oneCCL, RCCL, MSCCL, ACCL, and Gloo), with a focus on the industry-led ones for deep learning workloads. We investigate the design features of these xCCLs, discuss their use cases in industry deep learning workloads, compare their performance using industry-made benchmarks (i.e., NCCL Tests and PARAM), and discuss key takeaways and interesting observations. We believe our survey sheds light on potential research directions for future xCCL designs.
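
As a concrete illustration of the gradient exchange the abstract refers to, the sketch below averages gradients across workers with an all-reduce via PyTorch's torch.distributed, which can dispatch to backends such as NCCL or Gloo. This is an illustrative example, not code from the paper: the helper name average_gradients and the launch assumptions (e.g., torchrun setting RANK/WORLD_SIZE) are our own.

import torch
import torch.distributed as dist

def average_gradients(model):
    """Sum each parameter's gradient across all ranks, then average."""
    world_size = dist.get_world_size()
    for param in model.parameters():
        if param.grad is not None:
            dist.all_reduce(param.grad, op=dist.ReduceOp.SUM)
            param.grad /= world_size

if __name__ == "__main__":
    # Assumes launch via torchrun, which sets RANK, WORLD_SIZE, MASTER_ADDR, etc.
    dist.init_process_group(backend="gloo")  # use "nccl" on GPU nodes
    model = torch.nn.Linear(4, 2)
    loss = model(torch.randn(8, 4)).sum()
    loss.backward()                # populate local gradients
    average_gradients(model)       # gradients are now identical on every rank
    dist.destroy_process_group()

After this step, every rank holds the same averaged gradients and can apply an identical optimizer update, which is the synchronization pattern that libraries like NCCL and Gloo accelerate.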

Journal Article

Journal: Journal of Computer Science and Technology
Volume: 38
Number: 1
Article (Eid): 166
Pages: 166 (29 pages)
DOI: 10.1007/s11390-023-2894-6
Series: JCST '23
Note: Invited Paper for the Special Issue in Honor of Professor Kai Hwang's 80th Birthday
