A zeroth-order proximal stochastic gradient method for weakly convex stochastic optimization

Spyridon Pougkakiotis (Lead / Corresponding author), Dionysis Kalogerias

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)
85 Downloads (Pure)

Abstract

In this paper we analyze a zeroth-order proximal stochastic gradient method suitable for the minimization of weakly convex stochastic optimization problems. We consider nonsmooth and nonlinear stochastic composite problems, for which (sub)gradient information might be unavailable. The proposed algorithm utilizes the well-known Gaussian smoothing technique, which yields unbiased zeroth-order gradient estimators of a related partially smooth surrogate problem (in which one of the two nonsmooth terms in the original problem's objective is replaced by a smooth approximation). This allows us to employ a standard proximal stochastic gradient scheme for the approximate solution of the surrogate problem, which is determined by a single smoothing parameter, and without the utilization of first-order information. We provide state-of-the-art convergence rates for the proposed zeroth-order method using minimal assumptions. The proposed scheme is numerically compared against alternative zeroth-order methods as well as a stochastic subgradient scheme on a standard phase retrieval problem. Further, we showcase the usefulness and effectiveness of our method in the unique setting of automated hyperparameter tuning. In particular, we focus on automatically tuning the parameters of optimization algorithms by minimizing a novel heuristic model. The proposed approach is tested on a proximal alternating direction method of multipliers for the solution of L1/L2-regularized PDE-constrained optimal control problems, with evident empirical success.
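To make the recipe in the abstract concrete, the sketch below illustrates one plausible form of a zeroth-order proximal stochastic gradient iteration: a two-point Gaussian-smoothing estimator of the gradient of the smoothed surrogate f_mu(x) = E_{u ~ N(0, I)}[f(x + mu*u)], followed by a proximal step on the remaining nonsmooth term. This is a minimal Python illustration under assumed choices (an L1 regularizer as the proximable term, a constant step size, and an illustrative phase-retrieval loss), not the authors' implementation; all names and parameter values are hypothetical.

    import numpy as np

    def zo_gradient(f, x, mu, rng):
        # Two-point Gaussian-smoothing estimator: an unbiased estimate of
        # grad f_mu(x), where f_mu(x) = E_{u ~ N(0, I)}[f(x + mu * u)].
        u = rng.standard_normal(x.shape)
        return (f(x + mu * u) - f(x)) / mu * u

    def prox_l1(x, t):
        # Proximal operator of t * ||.||_1 (soft-thresholding); stands in
        # for whatever proximable nonsmooth term the composite carries.
        return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

    def zo_prox_sg(sample_loss, x0, mu=1e-3, step=1e-2, lam=0.1,
                   iters=5000, seed=0):
        # Approximately minimize E[f(x; xi)] + lam * ||x||_1 using only
        # function evaluations of f (no first-order information).
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float).copy()
        for _ in range(iters):
            f = sample_loss(rng)              # draw one realization f(.; xi)
            g = zo_gradient(f, x, mu, rng)    # zeroth-order gradient estimate
            x = prox_l1(x - step * g, step * lam)
        return x

    # Illustrative use on the nonsmooth, weakly convex phase-retrieval
    # loss mentioned in the abstract: f(x; a, b) = |<a, x>^2 - b|.
    dim = 50
    rng0 = np.random.default_rng(1)
    x_true = rng0.standard_normal(dim)

    def sample_loss(rng):
        a = rng.standard_normal(dim)
        b = (a @ x_true) ** 2
        return lambda x: abs((a @ x) ** 2 - b)

    x_hat = zo_prox_sg(sample_loss, x0=rng0.standard_normal(dim))

Note the single smoothing parameter mu plays the role described in the abstract: it alone determines the surrogate problem being solved, trading off smoothness of the surrogate against approximation error.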

Original language: English
Number of pages: 24
Journal: SIAM Journal on Scientific Computing
Volume: 45
Issue number: 5
DOIs:
Publication status: Published - 16 Oct 2023

Keywords

  • composite optimization
  • hyperparameter tuning
  • stochastic gradient descent
  • weakly convex stochastic optimization
  • zeroth-order optimization

ASJC Scopus subject areas

  • Computational Mathematics
  • Applied Mathematics

Related research output

  • Z-ProxSG: Metatuning Software

    Pougkakiotis, S. (Creator), 2022

    Research output: Non-textual form › Software
