Gradient Compression in Distributed Deep Learning
This article introduces distributed deep learning and, in particular, gradient compression: a technique that reduces the communication overhead between machines when training large deep learning models, by shrinking the gradient updates each worker must exchange.
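As a minimal illustration of the idea, one widely used compression scheme is top-k sparsification: each worker transmits only the k largest-magnitude gradient entries (as index/value pairs) instead of the full dense gradient. The sketch below uses NumPy; the function names are illustrative, not from any particular distributed-training library:

```python
import numpy as np

def topk_compress(grad: np.ndarray, k: int):
    """Keep only the k largest-magnitude entries; send (indices, values, shape)."""
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx], grad.shape

def topk_decompress(idx, vals, shape):
    """Rebuild a dense gradient, with zeros in the untransmitted positions."""
    flat = np.zeros(int(np.prod(shape)), dtype=vals.dtype)
    flat[idx] = vals
    return flat.reshape(shape)

grad = np.array([[0.1, -2.0, 0.03],
                 [1.5, -0.2, 0.0]])
idx, vals, shape = topk_compress(grad, k=2)
restored = topk_decompress(idx, vals, shape)
# Only the two largest-magnitude entries (-2.0 and 1.5) survive;
# the worker sends 2 index/value pairs instead of 6 floats.
```

In practice, schemes like this are paired with error feedback (accumulating the dropped entries locally and adding them back into the next step's gradient) so that the sparsification error does not bias training.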