A near-optimal direct-sum theorem for communication complexity

08/17/2020
by   Rahul Jain, et al.

We show a near-optimal direct-sum theorem for two-party randomized communication complexity. Let f ⊆ X × Y × Z be a relation, ε > 0, and k an integer. We show that R^pub_ε(f^k) · log(R^pub_ε(f^k)) ≥ Ω(k · R^pub_ε(f)), where f^k = f × … × f (k times) and R^pub_ε(·) denotes the public-coin randomized communication complexity with worst-case error ε. Given a protocol 𝒫 for f^k with communication cost c · k and worst-case error ε, we exhibit a protocol 𝒬 for f with external information cost O(c) and worst-case error ε. We then use a message-compression protocol due to Barak, Braverman, Chen, and Rao [2013] to simulate 𝒬 with communication O(c · log(c · k)), which yields our result. To obtain this reduction we prove some new chain rules for capacity, the maximum information that can be transmitted over a communication channel. We use the powerful concept of Nash equilibrium from game theory, and its existence in suitably defined games, to derive these chain rules, which are of independent interest.
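The chain of reductions described in the abstract can be sketched as the following sequence of bounds. This is a paraphrase of the stated argument, not a formal proof; the notation IC^ext for external information cost is our assumption, as is writing c for the per-copy communication rate of the protocol for f^k.

```latex
\begin{align*}
  &\text{Assume } \mathcal{P} \text{ solves } f^k
      \text{ with communication } c \cdot k
      \text{ and worst-case error } \varepsilon. \\[4pt]
  &\text{Step 1 (reduction): exhibit } \mathcal{Q} \text{ for } f
      \text{ with } \mathrm{IC}^{\mathrm{ext}}(\mathcal{Q}) = O(c)
      \text{ and error } \varepsilon. \\[4pt]
  &\text{Step 2 (compression, Barak--Braverman--Chen--Rao 2013):} \\
  &\qquad R^{\mathrm{pub}}_{\varepsilon}(f)
      \;\le\; O\!\bigl(c \cdot \log(c \cdot k)\bigr). \\[4pt]
  &\text{Step 3 (combine, taking } c \cdot k = R^{\mathrm{pub}}_{\varepsilon}(f^k)\text{):} \\
  &\qquad k \cdot R^{\mathrm{pub}}_{\varepsilon}(f)
      \;\le\; O\!\Bigl(R^{\mathrm{pub}}_{\varepsilon}(f^k)
      \cdot \log R^{\mathrm{pub}}_{\varepsilon}(f^k)\Bigr),
\end{align*}
```

which rearranges to the stated direct-sum bound R^pub_ε(f^k) · log(R^pub_ε(f^k)) ≥ Ω(k · R^pub_ε(f)).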
