Optimal Online Discrepancy Minimization

08/02/2023
by Janardhan Kulkarni et al.

We prove that there exists an online algorithm that, for any sequence of vectors v_1, …, v_T ∈ ℝ^n with ‖v_i‖_2 ≤ 1 arriving one at a time, decides random signs x_1, …, x_T ∈ {-1, 1} so that for every t ≤ T, the prefix sum ∑_{i=1}^t x_i v_i is 10-subgaussian. This improves over the work of Alweiss, Liu and Sawhney, who kept prefix sums O(√(log(nT)))-subgaussian, and gives an O(√(log T)) bound on the discrepancy max_{t ≤ T} ‖∑_{i=1}^t x_i v_i‖_∞. Our proof combines a generalization of Banaszczyk's prefix balancing result to trees with a cloning argument that finds distributions over colorings rather than single colorings. We also show that an oblivious adversary has a matching Ω(√(log T)) strategy.
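The abstract does not spell out the authors' construction, but the earlier self-balancing walk of Alweiss, Liu and Sawhney, which this result improves on, is simple to sketch: bias each random sign against the correlation of the incoming vector with the running prefix sum. The function name, the parameter value c = 30.0, and the clipping below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def self_balancing_walk(vectors, c=30.0, rng=None):
    """Sign each incoming vector, biased against the running prefix sum.

    A sketch in the spirit of the Alweiss-Liu-Sawhney self-balancing
    walk. `c` plays the role of their Theta(log(nT)) balancing
    parameter; 30.0 is an arbitrary illustrative value.
    """
    rng = np.random.default_rng() if rng is None else rng
    w = None                              # running signed prefix sum
    signs = []
    for v in vectors:
        v = np.asarray(v, dtype=float)
        if w is None:
            w = np.zeros_like(v)
        corr = np.clip(w @ v, -c, c)      # correlation with the prefix sum,
                                          # clipped (ALS show overflow is rare)
        p_plus = 0.5 - corr / (2.0 * c)   # bias the sign against the drift
        x = 1 if rng.random() < p_plus else -1
        w += x * v
        signs.append(x)
    return signs, w
```

For instance, on random unit vectors the final prefix sum stays small in every coordinate:

```python
rng = np.random.default_rng(0)
vs = rng.normal(size=(10_000, 50))
vs /= np.linalg.norm(vs, axis=1, keepdims=True)   # unit-norm rows
signs, final = self_balancing_walk(vs, rng=rng)
print(np.abs(final).max())                        # polylogarithmic in practice
```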
