Straggler mitigation through unequal error protection for distributed approximate matrix multiplication
Abstract
Large-scale machine learning and data mining methods routinely distribute computations across multiple agents to parallelize processing. The time required for the computations at the agents is affected by the availability of local resources and/or poor channel conditions, thus giving rise to the "straggler problem." In this paper, we address this problem for distributed approximate matrix multiplication. In particular, we employ Unequal Error Protection (UEP) codes to obtain an approximation of the matrix product, providing higher protection to the blocks that have a larger effect on the multiplication outcome. We characterize the performance of the proposed approach from a theoretical perspective by bounding the expected reconstruction error for matrices with uncorrelated entries. We also apply the proposed coding strategy to the evaluation of the gradients in the back-propagation step of training a Deep Neural Network (DNN) for an image classification task. Our numerical experiments show that it is indeed possible to obtain significant improvements in the overall time required to achieve DNN training convergence by producing approximations of matrix products using UEP codes in the presence of stragglers.
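To make the underlying idea concrete, the following is a minimal NumPy sketch, not the paper's UEP code construction: it only illustrates ranking block products by an estimate of their contribution to the result (via Frobenius norms) and assembling an approximate product from whichever block products return before a deadline. The block sizes, the norm-based scoring, and the straggler model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: A is (m x n), B is (n x p); split A into row blocks, B into column blocks.
m, n, p, blocks = 8, 6, 8, 4
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, p))

A_blocks = np.split(A, blocks, axis=0)   # row blocks of A
B_blocks = np.split(B, blocks, axis=1)   # column blocks of B

# Score each block pair's contribution to C = A @ B: ||A_i||_F * ||B_j||_F
# upper-bounds ||A_i B_j||_F, so higher-scoring blocks matter more to the result.
scores = {(i, j): np.linalg.norm(A_blocks[i]) * np.linalg.norm(B_blocks[j])
          for i in range(blocks) for j in range(blocks)}
priority = sorted(scores, key=scores.get, reverse=True)

# Under UEP, higher-scoring blocks would be encoded with more redundancy; here we
# mimic that by giving them a higher probability of being recovered before the deadline.
n_pairs = blocks * blocks
returned = {pair for rank, pair in enumerate(priority)
            if rng.random() < 0.9 - 0.5 * rank / n_pairs}

# Assemble the approximate product from the block products that did return in time.
C_hat = np.zeros((m, p))
row_sz, col_sz = m // blocks, p // blocks
for (i, j) in returned:
    C_hat[i * row_sz:(i + 1) * row_sz, j * col_sz:(j + 1) * col_sz] = A_blocks[i] @ B_blocks[j]

C = A @ B
rel_err = np.linalg.norm(C - C_hat) / np.linalg.norm(C)
print(f"returned {len(returned)}/{n_pairs} block products, relative error {rel_err:.3f}")
```

Because the highest-impact block products are the most likely to be available, the reconstruction error degrades gracefully as the number of stragglers grows, which is the behavior the UEP coding scheme is designed to achieve.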