Cross Subspace Alignment Codes for Coded Distributed Batch Computation

09/30/2019
by Zhuqing Jia, et al.

Coded distributed batch computation distributes a computation task, such as matrix multiplication, N-linear computation, or multivariate polynomial evaluation, across S servers through a coding scheme, such that the responses from any R servers (R is called the recovery threshold) are sufficient for the user to recover the desired computed value. Current approaches are based either exclusively on matrix partitioning (e.g., Entangled Polynomial (EP) codes for matrix multiplication) or exclusively on batch processing (e.g., Lagrange Coded Computing (LCC)). We present three related classes of codes based on the idea of Cross-Subspace Alignment (CSA), which was originally introduced in the context of private information retrieval. CSA codes are characterized by a Cauchy-Vandermonde matrix structure that facilitates interference alignment along the Vandermonde terms, while the desired computations remain resolvable along the Cauchy terms. These codes unify, generalize, and improve upon the state-of-the-art codes for distributed computing. First, we introduce CSA codes for matrix multiplication, which yield LCC codes as a special case and are shown to outperform LCC codes in general over strictly download-limited settings. Next, we introduce Generalized CSA (GCSA) codes for matrix multiplication, which bridge the extremes of the matrix-partitioning and batch-processing approaches. Finally, we introduce N-CSA codes for N-linear distributed batch computations and multivariate batch polynomial evaluations. N-CSA codes include LCC codes as a special case and are in general capable of achieving significantly lower download costs than LCC codes due to cross-subspace alignment. Generalizations of N-CSA codes that accommodate X-secure data and B-Byzantine servers are also obtained.
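To make the Cauchy-Vandermonde structure concrete, the following is a minimal sketch (not the authors' implementation) of CSA-style encoding and decoding for the simplest batch matrix multiplication setting: a batch of L whole-matrix products A_l B_l over a prime field, S = 2L - 1 servers, and no matrix partitioning. The field size, the constants f_l, the evaluation points alpha_s, and all variable names are illustrative assumptions. The desired products ride on the Cauchy terms 1/(f_l - alpha_s), while interference aligns along the Vandermonde terms alpha_s^t.

```python
# Minimal CSA-style sketch for batch matrix multiplication over GF(p).
# All parameters below are illustrative assumptions, not from the paper.
import random

p = 2_147_483_647          # a Mersenne prime; arithmetic is over GF(p)
L, m, k, n = 3, 2, 2, 2    # batch size and (m x k)(k x n) matrix dimensions
S = 2 * L - 1              # number of servers = recovery threshold here

inv = lambda x: pow(x % p, p - 2, p)   # modular inverse in GF(p)

# Distinct constants f_l and evaluation points alpha_s (all pairwise distinct).
f = [1, 2, 3]
alpha = [10, 11, 12, 13, 14]

def rand_mat(r, c):
    return [[random.randrange(p) for _ in range(c)] for _ in range(r)]

def matmul(A, B):
    return [[sum(A[i][t] * B[t][j] for t in range(len(B))) % p
             for j in range(len(B[0]))] for i in range(len(A))]

A = [rand_mat(m, k) for _ in range(L)]
B = [rand_mat(k, n) for _ in range(L)]

# Encoding (Cauchy-Vandermonde structure): server s receives
#   A~_s = sum_l (prod_{l' != l} (f_l' - alpha_s)) A_l,
#   B~_s = sum_l  B_l / (f_l - alpha_s).
def encode(s):
    At = [[0] * k for _ in range(m)]
    Bt = [[0] * n for _ in range(k)]
    for l in range(L):
        ca = 1
        for l2 in range(L):
            if l2 != l:
                ca = ca * (f[l2] - alpha[s]) % p
        cb = inv(f[l] - alpha[s])
        for i in range(m):
            for j in range(k):
                At[i][j] = (At[i][j] + ca * A[l][i][j]) % p
        for i in range(k):
            for j in range(n):
                Bt[i][j] = (Bt[i][j] + cb * B[l][i][j]) % p
    return At, Bt

# Each server returns the single product A~_s B~_s.
answers = [matmul(*encode(s)) for s in range(S)]

# Decoding: the answer of server s equals
#   sum_l c_l/(f_l - alpha_s) * A_l B_l  +  sum_{t=0}^{L-2} alpha_s^t * I_t,
# with c_l = prod_{l' != l}(f_l' - f_l). Interference I_t aligns along the
# Vandermonde terms; the desired products stay resolvable along the Cauchy terms.
c = [1] * L
for l in range(L):
    for l2 in range(L):
        if l2 != l:
            c[l] = c[l] * (f[l2] - f[l]) % p

def solve(M, y):           # Gaussian elimination over GF(p)
    M = [row[:] + [yi] for row, yi in zip(M, y)]
    nvar = len(M[0]) - 1
    for col in range(nvar):
        piv = next(r for r in range(col, len(M)) if M[r][col])
        M[col], M[piv] = M[piv], M[col]
        ipiv = inv(M[col][col])
        M[col] = [x * ipiv % p for x in M[col]]
        for r in range(len(M)):
            if r != col and M[r][col]:
                M[r] = [(a - M[r][col] * b) % p for a, b in zip(M[r], M[col])]
    return [M[r][-1] for r in range(nvar)]

# Cauchy-Vandermonde decoding matrix: L Cauchy columns, L-1 Vandermonde columns.
CV = [[c[l] * inv(f[l] - alpha[s]) % p for l in range(L)] +
      [pow(alpha[s], t, p) for t in range(L - 1)] for s in range(S)]

recovered = [[[0] * n for _ in range(m)] for _ in range(L)]
for i in range(m):
    for j in range(n):
        x = solve(CV, [answers[s][i][j] for s in range(S)])
        for l in range(L):
            recovered[l][i][j] = x[l]

assert all(recovered[l] == matmul(A[l], B[l]) for l in range(L))
print("Recovered all", L, "products from", S, "server responses.")
```

With these parameters the sketch recovers all L products from exactly 2L - 1 responses, illustrating the recovery threshold in the unpartitioned batch case; the GCSA construction additionally partitions each matrix, which this sketch omits.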
