Comparing different subgradient methods for solving convex optimization problems with functional constraints

01/04/2021
by Thi Lan Dinh, et al.

We provide a dual subgradient method and a primal-dual subgradient method for standard convex optimization problems, with complexity 𝒪(ε^-2) and 𝒪(ε^-2r) for all r > 1, respectively. They are based on the recent work of Metel and Takeda [arXiv:2009.12769, 2020, pp. 1-12] and on Boyd's method [Lecture notes of EE364b, Stanford University, Spring 2013-14, pp. 1-39]. The efficiency of our methods is illustrated numerically in a comparison with existing methods.
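To make the setting concrete, below is a minimal sketch of the classical primal-dual subgradient iteration for a problem of the form min f(x) subject to g(x) ≤ 0. It is not the paper's specific method or step-size rule; the function names, the step size 1/√k, and the toy problem are illustrative assumptions only.

```python
# A minimal sketch of a generic primal-dual subgradient iteration for
#   min f(x)  subject to  g(x) <= 0.
# This illustrates the general scheme only; the paper's methods and
# step-size rules differ. The problem instance below is a toy example.
import numpy as np

def primal_dual_subgradient(sub_f, sub_g, g, x0, steps=5000):
    x, lam = x0.copy(), 0.0
    for k in range(1, steps + 1):
        alpha = 1.0 / np.sqrt(k)                      # diminishing step size
        # primal step: subgradient of the Lagrangian f(x) + lam * g(x) in x
        x = x - alpha * (sub_f(x) + lam * sub_g(x))
        # dual step: projected subgradient ascent on the multiplier
        lam = max(0.0, lam + alpha * g(x))
    return x, lam

# Toy instance: min ||x||^2  s.t.  1 - sum(x) <= 0  (i.e. sum(x) >= 1)
n = 5
g = lambda x: 1.0 - x.sum()
sub_f = lambda x: 2.0 * x                              # gradient of ||x||^2
sub_g = lambda x: -np.ones_like(x)                     # gradient of 1 - sum(x)

x_star, lam_star = primal_dual_subgradient(sub_f, sub_g, g, np.zeros(n))
print(x_star)   # approaches (1/n, ..., 1/n), the constrained minimizer
```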
