Printed from https://ideas.repec.org/a/spr/joptap/v193y2022i1d10.1007_s10957-022-02038-7.html

Oracle Complexity Separation in Convex Optimization

Author

Listed:
  • Anastasiya Ivanova

    (National Research University Higher School of Economics
    Grenoble Alpes University)

  • Pavel Dvurechensky

    (Weierstrass Institute for Applied Analysis and Stochastics)

  • Evgeniya Vorontsova

    (Catholic University of Louvain)

  • Dmitry Pasechnyuk

    (Moscow Institute of Physics and Technology
    ISP RAS Research Center for Trusted Artificial Intelligence)

  • Alexander Gasnikov

    (National Research University Higher School of Economics
    Moscow Institute of Physics and Technology
    ISP RAS Research Center for Trusted Artificial Intelligence
    Institute for Information Transmission Problems)

  • Darina Dvinskikh

    (National Research University Higher School of Economics
    Moscow Institute of Physics and Technology
    ISP RAS Research Center for Trusted Artificial Intelligence)

  • Alexander Tyurin

    (National Research University Higher School of Economics)

Abstract

Many convex optimization problems have structured objective functions written as a sum of functions with different oracle types (e.g., full gradient, coordinate derivative, stochastic gradient) and different arithmetic complexity per oracle call. In the strongly convex case, these functions also have different condition numbers, which ultimately determine the iteration complexity of first-order methods and the number of oracle calls required to achieve a given accuracy. Motivated by the desire to call more expensive oracles fewer times, we consider the problem of minimizing the sum of two functions and propose a generic algorithmic framework that separates the oracle complexities of the two functions: each function's oracle is called only as many times as its own oracle complexity would require if the other function were absent. Our general accelerated framework covers the setting of (strongly) convex objectives, the setting when both parts are given through a full gradient oracle, as well as the settings when one of them is given by a coordinate derivative oracle or has a finite-sum structure and is available through a stochastic gradient oracle. In the latter two cases, we obtain accelerated random coordinate descent and accelerated variance-reduced methods with oracle complexity separation.
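The core idea of the abstract, calling the expensive oracle far less often than the cheap one, can be illustrated with a sliding-style sketch. The toy quadratic instance, stepsizes, and iteration counts below are illustrative assumptions and not the paper's actual method or parameters: each outer iteration makes a single call to the expensive gradient oracle, freezes that gradient, and runs several cheap inner steps on the remaining part plus a proximal term.

```python
import numpy as np

# Toy instance (hypothetical, not from the paper): minimize F(x) = f(x) + g(x),
# where f(x) = 0.5 x'Af x has an "expensive" gradient oracle and
# g(x) = 0.5 x'Ag x has a "cheap" one; the minimizer is x* = 0.
rng = np.random.default_rng(0)
n = 20
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
Af = A.T @ A / n + np.eye(n)   # Hessian of the expensive part f
Ag = B.T @ B / n + np.eye(n)   # Hessian of the cheap part g

expensive_calls = 0

def grad_f(x):                  # expensive oracle: count every call
    global expensive_calls
    expensive_calls += 1
    return Af @ x

def grad_g(x):                  # cheap oracle: called freely in the inner loop
    return Ag @ x

x = np.ones(n)
beta, eta, inner = 0.05, 0.03, 30   # illustrative stepsizes, not from theory
for _ in range(300):
    gf = grad_f(x)              # ONE expensive call per outer iteration
    z = x.copy()
    for _ in range(inner):      # inner loop touches only the cheap oracle:
        # gradient step on  <gf, z> + g(z) + ||z - x||^2 / (2*beta)
        z = z - eta * (gf + grad_g(z) + (z - x) / beta)
    x = z

print(expensive_calls)          # 300: grows with outer iterations only
print(np.linalg.norm(x))        # near 0, the minimizer of F
```

With exact inner solves, each outer step is a proximal step on g around a linearization of f; the point of the sketch is only the call accounting: the expensive-oracle counter grows with the outer iterations alone, while all the inner work is absorbed by the cheap oracle.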

Suggested Citation

  • Anastasiya Ivanova & Pavel Dvurechensky & Evgeniya Vorontsova & Dmitry Pasechnyuk & Alexander Gasnikov & Darina Dvinskikh & Alexander Tyurin, 2022. "Oracle Complexity Separation in Convex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 193(1), pages 462-490, June.
  • Handle: RePEc:spr:joptap:v:193:y:2022:i:1:d:10.1007_s10957-022-02038-7
    DOI: 10.1007/s10957-022-02038-7

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-022-02038-7
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-022-02038-7?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Yurii Nesterov, 2018. "Lectures on Convex Optimization," Springer Optimization and Its Applications, Springer, edition 2, number 978-3-319-91578-4, December.
    2. Yurii Nesterov, 2012. "Efficiency of coordinate descent methods on huge-scale optimization problems," LIDAM Reprints CORE 2511, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    3. Yurii Nesterov & Sebastian U. Stich, 2017. "Efficiency of the accelerated coordinate descent method on structured optimization problems," LIDAM Reprints CORE 2845, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    4. Pavel Dvurechensky & Eduard Gorbunov & Alexander Gasnikov, 2021. "An accelerated directional derivative method for smooth stochastic convex optimization," European Journal of Operational Research, Elsevier, vol. 290(2), pages 601-621.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. A. Ghaffari-Hadigheh & L. Sinjorgo & R. Sotirov, 2024. "On convergence of a q-random coordinate constrained algorithm for non-convex problems," Journal of Global Optimization, Springer, vol. 90(4), pages 843-868, December.
    2. Dvurechensky, Pavel & Gorbunov, Eduard & Gasnikov, Alexander, 2021. "An accelerated directional derivative method for smooth stochastic convex optimization," European Journal of Operational Research, Elsevier, vol. 290(2), pages 601-621.
    3. Shota Takahashi & Mituhiro Fukuda & Mirai Tanaka, 2022. "New Bregman proximal type algorithms for solving DC optimization problems," Computational Optimization and Applications, Springer, vol. 83(3), pages 893-931, December.
    4. A. Scagliotti & P. Colli Franzone, 2022. "A piecewise conservative method for unconstrained convex optimization," Computational Optimization and Applications, Springer, vol. 81(1), pages 251-288, January.
    5. Xin Jiang & Lieven Vandenberghe, 2022. "Bregman primal–dual first-order method and application to sparse semidefinite programming," Computational Optimization and Applications, Springer, vol. 81(1), pages 127-159, January.
    6. Xuefei Lu & Alessandro Rudi & Emanuele Borgonovo & Lorenzo Rosasco, 2020. "Faster Kriging: Facing High-Dimensional Simulators," Operations Research, INFORMS, vol. 68(1), pages 233-249, January.
    7. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2016. "Exact worst-case performance of first-order methods for composite convex optimization," LIDAM Discussion Papers CORE 2016052, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    8. Huiyi Cao & Kamil A. Khan, 2023. "General convex relaxations of implicit functions and inverse functions," Journal of Global Optimization, Springer, vol. 86(3), pages 545-572, July.
    9. Andrej Čopar & Blaž Zupan & Marinka Zitnik, 2019. "Fast optimization of non-negative matrix tri-factorization," PLOS ONE, Public Library of Science, vol. 14(6), pages 1-15, June.
    10. Duy Khuong Nguyen & Tu Bao Ho, 2017. "Accelerated parallel and distributed algorithm using limited internal memory for nonnegative matrix factorization," Journal of Global Optimization, Springer, vol. 68(2), pages 307-328, June.
    11. Abbaszadehpeivasti, Hadi & de Klerk, Etienne & Zamani, Moslem, 2023. "Convergence rate analysis of randomized and cyclic coordinate descent for convex optimization through semidefinite programming," Other publications TiSEM 88512ac0-c26a-4a99-b840-3, Tilburg University, School of Economics and Management.
    12. Francisco García Riesgo & Sergio Luis Suárez Gómez & Enrique Díez Alonso & Carlos González-Gutiérrez & Jesús Daniel Santos, 2021. "Fully Convolutional Approaches for Numerical Approximation of Turbulent Phases in Solar Adaptive Optics," Mathematics, MDPI, vol. 9(14), pages 1-20, July.
    13. Ion Necoara & Andrei Patrascu, 2014. "A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints," Computational Optimization and Applications, Springer, vol. 57(2), pages 307-337, March.
    14. Pavel Shcherbakov & Mingyue Ding & Ming Yuchi, 2021. "Random Sampling Many-Dimensional Sets Arising in Control," Mathematics, MDPI, vol. 9(5), pages 1-16, March.
    15. Liam Madden & Stephen Becker & Emiliano Dall’Anese, 2021. "Bounds for the Tracking Error of First-Order Online Optimization Methods," Journal of Optimization Theory and Applications, Springer, vol. 189(2), pages 437-457, May.
    16. Shariat Torbaghan, Shahab & Madani, Mehdi & Sels, Peter & Virag, Ana & Le Cadre, Hélène & Kessels, Kris & Mou, Yuting, 2021. "Designing day-ahead multi-carrier markets for flexibility: Models and clearing algorithms," Applied Energy, Elsevier, vol. 285(C).
    17. Sjur Didrik Flåm, 2019. "Blocks of coordinates, stochastic programming, and markets," Computational Management Science, Springer, vol. 16(1), pages 3-16, February.
    18. David Degras, 2021. "Sparse group fused lasso for model segmentation: a hybrid approach," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 15(3), pages 625-671, September.
    19. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
    20. Ion Necoara & Yurii Nesterov & François Glineur, 2017. "Random Block Coordinate Descent Methods for Linearly Constrained Optimization over Networks," Journal of Optimization Theory and Applications, Springer, vol. 173(1), pages 227-254, April.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joptap:v:193:y:2022:i:1:d:10.1007_s10957-022-02038-7. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.