
Exact Worst-Case Convergence Rates of the Proximal Gradient Method for Composite Convex Minimization

Author

Listed:
  • Adrien B. Taylor

    (Université catholique de Louvain)

  • Julien M. Hendrickx

    (Université catholique de Louvain)

  • François Glineur

    (Université catholique de Louvain)

Abstract

We study the worst-case convergence rates of the proximal gradient method for minimizing the sum of a smooth strongly convex function and a non-smooth convex function, whose proximal operator is available. We establish the exact worst-case convergence rates of the proximal gradient method in this setting for any step size and for different standard performance measures: objective function accuracy, distance to optimality and residual gradient norm. The proof methodology relies on recent developments in performance estimation of first-order methods, based on semidefinite programming. In the case of the proximal gradient method, this methodology allows obtaining exact and non-asymptotic worst-case guarantees that are conceptually very simple, although apparently new. On the way, we discuss how strong convexity can be replaced by weaker assumptions, while preserving the corresponding convergence rates. We also establish that the same fixed step size policy is optimal for all three performance measures. Finally, we extend recent results on the worst-case behavior of gradient descent with exact line search to the proximal case.
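
To make the algorithm under analysis concrete: the proximal gradient method applied to minimizing f(x) + g(x) iterates x_{k+1} = prox_{γg}(x_k − γ∇f(x_k)), where γ is the step size. Below is a minimal Python sketch of this iteration on an illustrative lasso instance; the instance, the function names, and the fixed step size 1/L are assumptions made for this example, not details taken from the article.

    import numpy as np

    def proximal_gradient(grad_f, prox_g, x0, step, n_iters=200):
        # Iterate x_{k+1} = prox_{step*g}(x_k - step * grad_f(x_k)).
        x = x0
        for _ in range(n_iters):
            x = prox_g(x - step * grad_f(x), step)
        return x

    # Hypothetical composite instance (for illustration only):
    #   f(x) = 0.5 * ||Ax - b||^2   smooth; strongly convex when A has full column rank
    #   g(x) = lam * ||x||_1        non-smooth, with a closed-form proximal operator
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))
    b = rng.standard_normal(50)
    lam = 0.1

    grad_f = lambda x: A.T @ (A @ x - b)
    # The prox of t*lam*||.||_1 is coordinate-wise soft-thresholding.
    prox_g = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - t * lam, 0.0)

    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of grad_f (squared spectral norm of A)
    x_hat = proximal_gradient(grad_f, prox_g, np.zeros(20), step=1.0 / L)

The step size 1/L used here is one common choice; the article's contribution is the exact worst-case rates of such iterations for any step size, together with the fixed step size that is simultaneously optimal for all three performance measures.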

Suggested Citation

  • Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2018. "Exact Worst-Case Convergence Rates of the Proximal Gradient Method for Composite Convex Minimization," Journal of Optimization Theory and Applications, Springer, vol. 178(2), pages 455-476, August.
  • Handle: RePEc:spr:joptap:v:178:y:2018:i:2:d:10.1007_s10957-018-1298-1
    DOI: 10.1007/s10957-018-1298-1

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-018-1298-1
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-018-1298-1?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can access this item through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it below.


    References listed on IDEAS

    1. Taylor, A. & Hendrickx, J. & Glineur, F., 2015. "Smooth Strongly Convex Interpolation and Exact Worst-case Performance of First-order Methods," LIDAM Discussion Papers CORE 2015013, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    2. Taylor, Adrien B. & Hendrickx, Julien M. & Glineur, François, 2016. "Exact worst-case performance of first-order methods for composite convex optimization," LIDAM Discussion Papers CORE 2016052, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    3. Patrick L. Combettes & Jean-Christophe Pesquet, 2011. "Proximal Splitting Methods in Signal Processing," Springer Optimization and Its Applications, in: Heinz H. Bauschke & Regina S. Burachik & Patrick L. Combettes & Veit Elser & D. Russell Luke & Henry Wolkowicz (ed.), Fixed-Point Algorithms for Inverse Problems in Science and Engineering, chapter 0, pages 185-212, Springer.
    4. Nesterov, Yurii, 2012. "Efficiency of coordinate descent methods on huge-scale optimization problems," LIDAM Reprints CORE 2511, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    5. Necoara, Ion & Nesterov, Yurii & Glineur, François, 2019. "Linear convergence of first order methods for non-strongly convex optimization," LIDAM Reprints CORE 3000, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    6. Devolder, Olivier & Glineur, François & Nesterov, Yurii, 2011. "First-order methods of smooth convex optimization with inexact oracle," LIDAM Discussion Papers CORE 2011002, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    7. Nesterov, Yurii, 2013. "Gradient methods for minimizing composite functions," LIDAM Reprints CORE 2510, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    8. de Klerk, Etienne & Glineur, François & Taylor, Adrien B., 2016. "On the Worst-case Complexity of the Gradient Method with Exact Line Search for Smooth Strongly Convex Functions," LIDAM Discussion Papers CORE 2016027, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Abbaszadehpeivasti, Hadi, 2024. "Performance analysis of optimization methods for machine learning," Other publications TiSEM 3050a62d-1a1f-494e-99ef-7, Tilburg University, School of Economics and Management.
    2. André Uschmajew & Bart Vandereycken, 2022. "A Note on the Optimal Convergence Rate of Descent Methods with Fixed Step Sizes for Smooth Strongly Convex Functions," Journal of Optimization Theory and Applications, Springer, vol. 194(1), pages 364-373, July.
    3. Sandra S. Y. Tan & Antonios Varvitsiotis & Vincent Y. F. Tan, 2021. "Analysis of Optimization Algorithms via Sum-of-Squares," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 56-81, July.
    4. Wei Peng & Hui Zhang & Xiaoya Zhang & Lizhi Cheng, 2020. "Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions," Journal of Global Optimization, Springer, vol. 78(1), pages 69-89, September.
    5. Guoyong Gu & Junfeng Yang, 2024. "Tight Ergodic Sublinear Convergence Rate of the Relaxed Proximal Point Algorithm for Monotone Variational Inequalities," Journal of Optimization Theory and Applications, Springer, vol. 202(1), pages 373-387, July.
    6. Donghwan Kim & Jeffrey A. Fessler, 2021. "Optimizing the Efficiency of First-Order Methods for Decreasing the Gradient of Smooth Convex Functions," Journal of Optimization Theory and Applications, Springer, vol. 188(1), pages 192-219, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Taylor, Adrien B. & Hendrickx, Julien M. & Glineur, François, 2016. "Exact worst-case performance of first-order methods for composite convex optimization," LIDAM Discussion Papers CORE 2016052, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    2. Sandra S. Y. Tan & Antonios Varvitsiotis & Vincent Y. F. Tan, 2021. "Analysis of Optimization Algorithms via Sum-of-Squares," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 56-81, July.
    3. Olivier Fercoq & Zheng Qu, 2020. "Restarting the accelerated coordinate descent method with a rough strong convexity estimate," Computational Optimization and Applications, Springer, vol. 75(1), pages 63-91, January.
    4. Abbaszadehpeivasti, Hadi & de Klerk, Etienne & Zamani, Moslem, 2022. "The exact worst-case convergence rate of the gradient method with fixed step lengths for L-smooth functions," Other publications TiSEM 061688c6-f97c-4024-bb5b-1, Tilburg University, School of Economics and Management.
    5. Masoud Ahookhosh & Arnold Neumaier, 2018. "Solving structured nonsmooth convex optimization with complexity $\mathcal{O}(\varepsilon^{-1/2})$," TOP: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 26(1), pages 110-145, April.
    6. Hadi Abbaszadehpeivasti & Etienne de Klerk & Moslem Zamani, 2024. "On the Rate of Convergence of the Difference-of-Convex Algorithm (DCA)," Journal of Optimization Theory and Applications, Springer, vol. 202(1), pages 475-496, July.
    7. Guoyong Gu & Junfeng Yang, 2024. "Tight Ergodic Sublinear Convergence Rate of the Relaxed Proximal Point Algorithm for Monotone Variational Inequalities," Journal of Optimization Theory and Applications, Springer, vol. 202(1), pages 373-387, July.
    8. Masaru Ito, 2016. "New results on subgradient methods for strongly convex optimization problems with a unified analysis," Computational Optimization and Applications, Springer, vol. 65(1), pages 127-172, September.
    9. Masoud Ahookhosh, 2019. "Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 89(3), pages 319-353, June.
    10. Julian Rasch & Antonin Chambolle, 2020. "Inexact first-order primal–dual algorithms," Computational Optimization and Applications, Springer, vol. 76(2), pages 381-430, June.
    11. Sun, Shilin & Wang, Tianyang & Yang, Hongxing & Chu, Fulei, 2022. "Damage identification of wind turbine blades using an adaptive method for compressive beamforming based on the generalized minimax-concave penalty function," Renewable Energy, Elsevier, vol. 181(C), pages 59-70.
    12. David Degras, 2021. "Sparse group fused lasso for model segmentation: a hybrid approach," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 15(3), pages 625-671, September.
    13. Ching-pei Lee & Stephen J. Wright, 2019. "Inexact Successive quadratic approximation for regularized optimization," Computational Optimization and Applications, Springer, vol. 72(3), pages 641-674, April.
    14. Le Thi Khanh Hien & Cuong V. Nguyen & Huan Xu & Canyi Lu & Jiashi Feng, 2019. "Accelerated Randomized Mirror Descent Algorithms for Composite Non-strongly Convex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 181(2), pages 541-566, May.
    15. Kimon Fountoulakis & Rachael Tappenden, 2018. "A flexible coordinate descent method," Computational Optimization and Applications, Springer, vol. 70(2), pages 351-394, June.
    16. Ya-Feng Liu & Xin Liu & Shiqian Ma, 2019. "On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming," Mathematics of Operations Research, INFORMS, vol. 44(2), pages 632-650, May.
    17. Reza Eghbali & Maryam Fazel, 2017. "Decomposable norm minimization with proximal-gradient homotopy algorithm," Computational Optimization and Applications, Springer, vol. 66(2), pages 345-381, March.
    18. Patrick R. Johnstone & Pierre Moulin, 2017. "Local and global convergence of a general inertial proximal splitting scheme for minimizing composite functions," Computational Optimization and Applications, Springer, vol. 67(2), pages 259-292, June.
    19. Majid Jahani & Naga Venkata C. Gudapati & Chenxin Ma & Rachael Tappenden & Martin Takáč, 2021. "Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences," Computational Optimization and Applications, Springer, vol. 79(2), pages 369-404, June.
    20. Bo Wen & Xiaojun Chen & Ting Kei Pong, 2018. "A proximal difference-of-convex algorithm with extrapolation," Computational Optimization and Applications, Springer, vol. 69(2), pages 297-324, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joptap:v:178:y:2018:i:2:d:10.1007_s10957-018-1298-1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.