Lower Bounds for Sparse Oblivious Subspace Embeddings

12/21/2021
by Yi Li, et al.

An oblivious subspace embedding (OSE), characterized by parameters m, n, d, ϵ, δ, is a random matrix Π ∈ ℝ^{m×n} such that for any d-dimensional subspace T ⊆ ℝ^n,

  ℙ_Π[∀x ∈ T, (1-ϵ)‖x‖_2 ≤ ‖Πx‖_2 ≤ (1+ϵ)‖x‖_2] ≥ 1-δ.

For ϵ and δ at most a small constant, we show that any OSE with one nonzero entry in each column must satisfy m = Ω(d²/(ϵ²δ)), establishing the optimality of the classical CountSketch matrix. When an OSE has 1/(9ϵ) nonzero entries in each column, we show that m = Ω(ϵ^{O(δ)} d²), improving on the previous Ω(ϵ² d²) lower bound due to Nelson and Nguyen (ICALP 2014).
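To make the objects in the abstract concrete, here is a minimal sketch of the CountSketch construction it refers to: a matrix with exactly one nonzero entry (a random sign in a random row) per column, tested empirically on a vector drawn from a random d-dimensional subspace. The parameter choices (n, d, m) are illustrative only and are not taken from the paper's bounds.

```python
import numpy as np

rng = np.random.default_rng(0)

def count_sketch(m, n, rng):
    """CountSketch matrix: each column has exactly one nonzero entry,
    a uniformly random sign placed in a uniformly random row."""
    rows = rng.integers(0, m, size=n)          # hash each column to a row
    signs = rng.choice([-1.0, 1.0], size=n)    # random sign per column
    Pi = np.zeros((m, n))
    Pi[rows, np.arange(n)] = signs
    return Pi

# Illustrative parameters (not the paper's regime).
n, d, m = 1000, 5, 400
Pi = count_sketch(m, n, rng)

# A vector x in a random d-dimensional subspace of R^n,
# via an orthonormal basis U obtained from a QR factorization.
U, _ = np.linalg.qr(rng.standard_normal((n, d)))
x = U @ rng.standard_normal(d)

# An OSE should keep this ratio within [1 - eps, 1 + eps] for all x in the subspace.
ratio = np.linalg.norm(Pi @ x) / np.linalg.norm(x)
print(f"distortion ratio: {ratio:.3f}")
```

The lower bound in the paper says that to make this guarantee hold simultaneously for all x in every d-dimensional subspace with probability 1-δ, a one-nonzero-per-column matrix needs m = Ω(d²/(ϵ²δ)) rows.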
