Research output per year
Spyridon Pougkakiotis (Lead / Corresponding author), Dionysis Kalogerias
Research output: Contribution to journal › Article › peer-review
In this paper, we analyze a zeroth-order proximal stochastic gradient method suitable for the minimization of weakly convex stochastic optimization problems. We consider nonsmooth and nonlinear stochastic composite problems, for which (sub)gradient information might be unavailable. The proposed algorithm utilizes the well-known Gaussian smoothing technique, which yields unbiased zeroth-order gradient estimators of a related partially smooth surrogate problem (in which one of the two nonsmooth terms in the original problem's objective is replaced by a smooth approximation). This allows us to employ a standard proximal stochastic gradient scheme for the approximate solution of the surrogate problem, which is determined by a single smoothing parameter, without the utilization of first-order information. We provide state-of-the-art convergence rates for the proposed zeroth-order method under minimal assumptions. The proposed scheme is numerically compared against alternative zeroth-order methods as well as a stochastic subgradient scheme on a standard phase retrieval problem. Further, we showcase the usefulness and effectiveness of our method in the setting of automated hyperparameter tuning. In particular, we focus on automatically tuning the parameters of optimization algorithms by minimizing a novel heuristic model. The proposed approach is tested on a proximal alternating direction method of multipliers for the solution of L1/L2-regularized PDE-constrained optimal control problems, with evident empirical success.
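The abstract's core recipe — a gradient-free step on a Gaussian-smoothed surrogate followed by a proximal step — can be illustrated with a minimal sketch. The code below is not the paper's implementation: the function names, step size, smoothing parameter `mu`, and the L1-regularized toy objective are all illustrative assumptions chosen only to make the structure concrete.

```python
import numpy as np

def zo_gradient(f, x, mu, rng, num_samples=5):
    # Gaussian-smoothing zeroth-order estimator:
    #   g = (f(x + mu*u) - f(x)) / mu * u,  u ~ N(0, I),
    # an (asymptotically) unbiased estimator of the gradient of the
    # smoothed surrogate f_mu(x) = E_u[f(x + mu*u)].
    g = np.zeros_like(x)
    for _ in range(num_samples):
        u = rng.standard_normal(x.shape)
        g += (f(x + mu * u) - f(x)) / mu * u
    return g / num_samples

def prox_l1(v, t):
    # Proximal operator of t*||.||_1 (soft-thresholding),
    # standing in for the nonsmooth term kept un-smoothed.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def zo_proximal_sgd(f, x0, mu=1e-3, step=5e-2, reg=0.1, iters=300, seed=0):
    # Zeroth-order proximal stochastic gradient method:
    # gradient-free step on the smoothed surrogate, then a prox step.
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(iters):
        x = prox_l1(x - step * zo_gradient(f, x, mu, rng), step * reg)
    return x

if __name__ == "__main__":
    # Toy nonsmooth objective: f(x) = ||x - c||_1 (no gradients queried).
    c = np.array([1.0, -1.0, 0.0])
    f = lambda x: np.abs(x - c).sum()
    x = zo_proximal_sgd(f, np.zeros(3))
    print(f(x))  # substantially below the initial value f(0) = 2
```

Note the division of labor matching the abstract: only one of the two nonsmooth terms (here `f`) is smoothed and handled by function evaluations alone, while the other (the L1 regularizer) is handled exactly through its proximal operator.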
Original language | English
---|---
Number of pages | 24
Journal | SIAM Journal on Scientific Computing
Volume | 45
Issue number | 5
DOIs | 
Publication status | Published - 16 Oct 2023