• Media type: Report; Text; E-Book
  • Title: Oracle complexity separation in convex optimization
  • Contributor: Ivanova, Anastasiya [Author]; Gasnikov, Alexander [Author]; Dvurechensky, Pavel [Author]; Dvinskikh, Darina [Author]; Tyurin, Alexander [Author]; Vorontsova, Evgeniya [Author]; Pasechnyuk, Dmitry [Author]
  • imprint: Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2020
  • Issue: Published version
  • Language: English
  • DOI: https://doi.org/10.34657/8399; https://doi.org/10.20347/WIAS.PREPRINT.2711
  • ISSN: 2198-5855
  • Keywords: variance reduction; acceleration; random coordinate descent; convex optimization; proximal method; composite optimization
  • Footnote: This data source also contains holdings records that do not lead to a full text.
  • Description: Regularized empirical risk minimization problems, ubiquitous in machine learning, are often composed of several blocks that can be treated using different types of oracles, e.g., full gradient, stochastic gradient, or coordinate derivative. Optimal oracle complexity is known and achievable separately for the full-gradient case, the stochastic-gradient case, etc. We propose a generic framework that combines optimal algorithms for different types of oracles in order to achieve separately optimal oracle complexity for each block, i.e., for each block the corresponding oracle is called the optimal number of times for a given accuracy. As a particular example, we demonstrate that for a combination of a full-gradient oracle with either a stochastic-gradient oracle or a coordinate-descent oracle, our approach leads to the optimal number of oracle calls separately for the full-gradient part and the stochastic/coordinate-descent part.
  • Access State: Open Access
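
The oracle-separation idea sketched in the abstract can be illustrated with a minimal toy example. The code below is a hypothetical sketch, not the authors' method: it minimizes a simple two-block quadratic, querying a full-gradient oracle for the first block and a random coordinate-derivative oracle for the second block, so each oracle is called independently per iteration. All function names and parameters are illustrative assumptions.

```python
import random

# Toy two-block objective: F(x, y) = 0.5*||x||^2 + 0.5*||y||^2.
# Block x uses a full-gradient oracle; block y uses a coordinate oracle.
# This only illustrates the idea of calling different oracle types on
# different blocks; it is not the accelerated framework of the preprint.

def full_grad_x(x):
    """Full-gradient oracle for the x-block: grad of 0.5*||x||^2 is x."""
    return list(x)

def coord_grad_y(y, j):
    """Coordinate oracle for the y-block: partial derivative w.r.t. y[j]."""
    return y[j]

def separated_descent(x, y, steps=200, lr=0.5, seed=0):
    """Alternate a full-gradient step on x with a coordinate step on y."""
    rng = random.Random(seed)
    x, y = list(x), list(y)
    for _ in range(steps):
        g = full_grad_x(x)                        # one full-oracle call
        x = [xi - lr * gi for xi, gi in zip(x, g)]
        j = rng.randrange(len(y))                 # one coordinate-oracle call
        y[j] -= lr * coord_grad_y(y, j)
    return x, y
```

For this strongly convex toy problem both blocks converge to zero; in the preprint's setting, the point is that the expensive oracle (here the full gradient) need not be called more often than its own optimal complexity requires, regardless of how many cheap coordinate calls the other block consumes.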