Higher-Order Accelerated Methods for Faster Non-Smooth Optimization

06/04/2019
by Brian Bullins, et al.

We provide improved convergence rates for various non-smooth optimization problems via higher-order accelerated methods. In the case of ℓ_∞ regression, we achieve an O(ϵ^-4/5) iteration complexity, breaking the O(ϵ^-1) barrier of previous methods. We arrive at a similar rate for the problem of ℓ_1-SVM, going beyond what is attainable by first-order methods with prox-oracle access for non-smooth, non-strongly convex problems. We further show how to achieve even faster rates by introducing higher-order regularization. Our results rely on recent advances in near-optimal accelerated methods for higher-order smooth convex optimization. In particular, we extend Nesterov's smoothing technique to show that the standard softmax approximation is not only smooth in the usual sense, but also higher-order smooth. With this observation in hand, we provide, to the best of our knowledge, the first example of higher-order acceleration techniques yielding faster rates for non-smooth optimization.
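For context, the "standard softmax approximation" mentioned above is the log-sum-exp smoothing of the maximum. The sketch below shows a textbook instance of Nesterov's smoothing applied to the ℓ_∞ regression objective; the exact form, constants, and choice of smoothing parameter μ are illustrative assumptions, not taken verbatim from the paper.

```latex
% A standard log-sum-exp (softmax) smoothing of the l_infty objective
% ||Ax - b||_infty = max_i |a_i^T x - b_i|, with m rows a_1, ..., a_m.
\[
  f_{\mu}(x) \;:=\; \mu \log\!\Bigg( \sum_{i=1}^{m}
      \Big( e^{(a_i^{\top} x - b_i)/\mu} + e^{-(a_i^{\top} x - b_i)/\mu} \Big) \Bigg)
\]
% The smoothed surrogate over-approximates the true objective by at most an
% additive mu*log(2m):
\[
  0 \;\le\; f_{\mu}(x) - \|Ax - b\|_{\infty} \;\le\; \mu \log(2m),
\]
% so taking mu = epsilon / log(2m) yields an epsilon-accurate smooth surrogate.
% The paper's key observation is that this surrogate is not only smooth in the
% usual (Lipschitz-gradient) sense but also higher-order smooth, which is what
% allows near-optimal higher-order accelerated methods to be applied.
```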
