Thinking Inside the Ball: Near-Optimal Minimization of the Maximal Loss

05/04/2021
by Yair Carmon, et al.

We characterize the complexity of minimizing max_i∈[N] f_i(x) for convex, Lipschitz functions f_1,…, f_N. For non-smooth functions, existing methods require O(Nϵ^-2) queries to a first-order oracle to compute an ϵ-suboptimal point and Õ(Nϵ^-1) queries if the f_i are O(1/ϵ)-smooth. We develop methods with improved complexity bounds of Õ(Nϵ^-2/3 + ϵ^-8/3) in the non-smooth case and Õ(Nϵ^-2/3 + √(N)ϵ^-1) in the O(1/ϵ)-smooth case. Our methods consist of a recently proposed ball optimization oracle acceleration algorithm (which we refine) and a careful implementation of said oracle for the softmax function. We also prove an oracle complexity lower bound scaling as Ω(Nϵ^-2/3), showing that our dependence on N is optimal up to polylogarithmic factors.
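The abstract's methods hinge on the softmax (log-sum-exp) surrogate for the maximal loss. The standard smoothing fact is that max_i v_i ≤ ϵ log Σ_i exp(v_i/ϵ) ≤ max_i v_i + ϵ log N, so minimizing the smooth surrogate to accuracy ϵ (with ϵ of order the target suboptimality, up to the log N factor) approximately minimizes the max. A minimal sketch of this standard approximation (the function name is illustrative, not taken from the paper):

```python
import math

def softmax_surrogate(values, eps):
    """Log-sum-exp smoothing of the max: eps * log(sum_i exp(v_i / eps)).
    Satisfies max(v) <= surrogate <= max(v) + eps * log(len(v))."""
    m = max(values)  # subtract the max before exponentiating, for numerical stability
    return m + eps * math.log(sum(math.exp((v - m) / eps) for v in values))

values = [0.3, -1.2, 0.9, 0.5]
eps = 0.1
approx = softmax_surrogate(values, eps)
# sandwich bound: the surrogate over-estimates the max by at most eps * log(N)
assert max(values) <= approx <= max(values) + eps * math.log(len(values))
```

As ϵ shrinks the surrogate tightens, but its smoothness constant grows like 1/ϵ, which is why the abstract's O(1/ϵ)-smooth regime arises naturally from this construction.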
