• Media type: E-Article
  • Title: Global convergence of model function based Bregman proximal minimization algorithms
  • Contributor: Mukkamala, Mahesh Chandra; Fadili, Jalal; Ochs, Peter
  • Published: Springer Science and Business Media LLC, 2022
  • Published in: Journal of Global Optimization, 83 (2022) 4, pages 753-781
  • Language: English
  • DOI: 10.1007/s10898-021-01114-y
  • ISSN: 0925-5001; 1573-2916
  • Description: Abstract: Lipschitz continuity of the gradient mapping of a continuously differentiable function plays a crucial role in designing various optimization algorithms. However, many functions arising in practical applications, such as low-rank matrix factorization or deep neural network problems, do not have a Lipschitz continuous gradient. This led to the development of a generalized notion known as the L-smad property, which is based on generalized proximity measures called Bregman distances. However, the L-smad property cannot handle nonsmooth functions; for example, simple nonsmooth functions like $\vert x^4-1 \vert$, as well as many practical composite problems, are out of scope. We fix this issue by proposing the MAP property, which generalizes the L-smad property and is also valid for a large class of structured nonconvex nonsmooth composite problems. Based on the proposed MAP property, we propose a globally convergent algorithm called Model BPG, which unifies several existing algorithms. The convergence analysis is based on a new Lyapunov function. We also numerically illustrate the superior performance of Model BPG on standard phase retrieval problems and Poisson linear inverse problems, compared to a state-of-the-art optimization method that is valid for generic nonconvex nonsmooth optimization problems.
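For context, the Bregman distance and the L-smad property mentioned in the abstract can be sketched as follows; these are the standard textbook definitions, not the article's own notation:

```latex
% Bregman distance generated by a convex, continuously differentiable
% kernel h (e.g. h(x) = (1/2)\|x\|^2 recovers the squared Euclidean distance):
D_h(x, y) \;=\; h(x) - h(y) - \langle \nabla h(y),\, x - y \rangle .

% L-smad ("L-smooth adaptable"): a differentiable function f satisfies the
% L-smad property relative to h if there exists L > 0 such that both
% L\,h - f \quad \text{and} \quad L\,h + f
% are convex on the domain of interest; this replaces the classical
% requirement that \nabla f be Lipschitz continuous.
```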