
Tight Ergodic Sublinear Convergence Rate of the Relaxed Proximal Point Algorithm for Monotone Variational Inequalities

Authors

  • Guoyong Gu (Nanjing University)
  • Junfeng Yang (Nanjing University)

Abstract

This paper considers the relaxed proximal point algorithm for solving monotone variational inequality problems, and our main contribution is the establishment of a tight ergodic sublinear convergence rate. First, the tight or exact worst-case convergence rate is computed using the performance estimation framework. It is observed that this numerical bound asymptotically coincides with the best-known existing rate, whose tightness is not clear. This implies that, without further assumptions, a sublinear convergence rate is likely the best achievable for the relaxed proximal point algorithm. Motivated by the numerical result, a concrete example is constructed, which provides a lower bound on the exact worst-case convergence rate. This lower bound coincides with the numerical bound computed via the performance estimation framework, leading us to conjecture that the lower bound provided by the example is exactly the tight worst-case rate; this conjecture is then verified theoretically. We have thus established an ergodic sublinear convergence rate that is tight in terms of both the sublinear order and the constants involved.
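In brief, each iteration of the algorithm studied here applies a resolvent (proximal) step followed by a relaxation step. The following is a minimal Python sketch of that iteration for an affine monotone operator F(x) = Ax + b; the operator, the step size lam, the relaxation factor rho, and the skew-symmetric test matrix are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def relaxed_ppa(A, b, x0, lam=1.0, rho=1.5, iters=200):
    """Relaxed PPA for the monotone inclusion 0 in A x + b:
    xt_k    = (I + lam*A)^{-1}(x_k - lam*b)   (resolvent step)
    x_{k+1} = x_k + rho*(xt_k - x_k)          (relaxation, rho in (0, 2))
    """
    n = len(x0)
    J = np.linalg.inv(np.eye(n) + lam * A)  # resolvent of the affine operator
    x = x0.astype(float).copy()
    ergodic_sum = np.zeros(n)
    for _ in range(iters):
        xt = J @ (x - lam * b)   # proximal (resolvent) step
        x = x + rho * (xt - x)   # relaxed update
        ergodic_sum += xt
    xbar = ergodic_sum / iters   # ergodic average: the iterate the tight
                                 # sublinear rate in the abstract refers to
    return x, xbar

# Rotation operator: monotone (skew-symmetric) but not strongly monotone,
# so only a sublinear ergodic rate can be expected in general.
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
b = np.zeros(2)
x_last, x_erg = relaxed_ppa(A, b, x0=np.array([1.0, 1.0]))
print(np.linalg.norm(A @ x_erg + b))  # residual of the ergodic iterate
```

The tightness analysis described in the abstract rests on the performance estimation framework, which casts the search for a worst-case problem instance as a semidefinite program over a Gram matrix of iterates and operator values. Below is a hedged cvxpy sketch of such a program: the pairwise constraints are the standard monotonicity interpolation conditions, and the residual objective ||g_N||^2 is a simplified stand-in for the ergodic gap measure analyzed in the paper, chosen here only to keep the sketch short.

```python
import numpy as np
import cvxpy as cp

N, lam, rho = 5, 1.0, 1.5  # horizon, step size, relaxation (assumed values)
dim = N + 1                # basis vectors: x0 - xstar, g_1, ..., g_N
G = cp.Variable((dim, dim), PSD=True)  # Gram matrix of those vectors

def xt(i):
    """Coefficients of xt_i - xstar: from x_{k+1} = x_k - rho*lam*g_k one gets
    x_i = x0 - rho*lam*sum_{j<i} g_j, and xt_i = x_i - lam*g_i."""
    v = np.zeros(dim)
    v[0] = 1.0
    v[1:i] -= rho * lam
    v[i] -= lam
    return v

def g(i):
    v = np.zeros(dim)
    v[i] = 1.0
    return v

def ip(u, v):
    """Inner product <u, v> of two coefficient vectors via the Gram matrix."""
    return cp.sum(cp.multiply(np.outer(u, v), G))

# Monotonicity of F at xt_1..xt_N and at the solution (0 in F(xstar)).
cons = [ip(g(i) - g(j), xt(i) - xt(j)) >= 0
        for i in range(1, N + 1) for j in range(i + 1, N + 1)]
cons += [ip(g(i), xt(i)) >= 0 for i in range(1, N + 1)]
cons += [G[0, 0] <= 1]  # normalization ||x0 - xstar||^2 <= 1

prob = cp.Problem(cp.Maximize(ip(g(N), g(N))), cons)
prob.solve()
print(prob.value)       # numerical worst-case residual ||g_N||^2
```

Solving such a program for increasing N traces out the numerical worst-case curve that, per the abstract, the authors match from below with an explicit example and then confirm theoretically.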

Suggested Citation

  • Guoyong Gu & Junfeng Yang, 2024. "Tight Ergodic Sublinear Convergence Rate of the Relaxed Proximal Point Algorithm for Monotone Variational Inequalities," Journal of Optimization Theory and Applications, Springer, vol. 202(1), pages 373-387, July.
  • Handle: RePEc:spr:joptap:v:202:y:2024:i:1:d:10.1007_s10957-022-02058-3
    DOI: 10.1007/s10957-022-02058-3

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-022-02058-3
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-022-02058-3?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2015. "Smooth Strongly Convex Interpolation and Exact Worst-case Performance of First-order Methods," LIDAM Discussion Papers CORE 2015013, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    2. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2018. "Exact Worst-Case Convergence Rates of the Proximal Gradient Method for Composite Convex Minimization," Journal of Optimization Theory and Applications, Springer, vol. 178(2), pages 455-476, August.
    3. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2016. "Exact worst-case performance of first-order methods for composite convex optimization," LIDAM Discussion Papers CORE 2016052, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    4. Donghwan Kim & Jeffrey A. Fessler, 2021. "Optimizing the Efficiency of First-Order Methods for Decreasing the Gradient of Smooth Convex Functions," Journal of Optimization Theory and Applications, Springer, vol. 188(1), pages 192-219, January.
    5. Guoyong Gu & Bingsheng He & Xiaoming Yuan, 2014. "Customized proximal point algorithms for linearly constrained convex minimization and saddle-point problems: a unified approach," Computational Optimization and Applications, Springer, vol. 59(1), pages 135-161, October.
    6. Donghwan Kim & Jeffrey A. Fessler, 2017. "On the Convergence Analysis of the Optimized Gradient Method," Journal of Optimization Theory and Applications, Springer, vol. 172(1), pages 187-205, January.
    7. Etienne de Klerk & François Glineur & Adrien B. Taylor, 2016. "On the Worst-case Complexity of the Gradient Method with Exact Line Search for Smooth Strongly Convex Functions," LIDAM Discussion Papers CORE 2016027, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sandra S. Y. Tan & Antonios Varvitsiotis & Vincent Y. F. Tan, 2021. "Analysis of Optimization Algorithms via Sum-of-Squares," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 56-81, July.
    2. Donghwan Kim & Jeffrey A. Fessler, 2021. "Optimizing the Efficiency of First-Order Methods for Decreasing the Gradient of Smooth Convex Functions," Journal of Optimization Theory and Applications, Springer, vol. 188(1), pages 192-219, January.
    3. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2018. "Exact Worst-Case Convergence Rates of the Proximal Gradient Method for Composite Convex Minimization," Journal of Optimization Theory and Applications, Springer, vol. 178(2), pages 455-476, August.
    4. Abbaszadehpeivasti, Hadi, 2024. "Performance analysis of optimization methods for machine learning," Other publications TiSEM 3050a62d-1a1f-494e-99ef-7, Tilburg University, School of Economics and Management.
    5. Abbaszadehpeivasti, Hadi & de Klerk, Etienne & Zamani, Moslem, 2022. "The exact worst-case convergence rate of the gradient method with fixed step lengths for L-smooth functions," Other publications TiSEM 061688c6-f97c-4024-bb5b-1, Tilburg University, School of Economics and Management.
    6. Hadi Abbaszadehpeivasti & Etienne de Klerk & Moslem Zamani, 2024. "On the Rate of Convergence of the Difference-of-Convex Algorithm (DCA)," Journal of Optimization Theory and Applications, Springer, vol. 202(1), pages 475-496, July.
    7. André Uschmajew & Bart Vandereycken, 2022. "A Note on the Optimal Convergence Rate of Descent Methods with Fixed Step Sizes for Smooth Strongly Convex Functions," Journal of Optimization Theory and Applications, Springer, vol. 194(1), pages 364-373, July.
    8. Roland Hildebrand, 2021. "Optimal step length for the Newton method: case of self-concordant functions," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 94(2), pages 253-279, October.
    9. A. Scagliotti & P. Colli Franzone, 2022. "A piecewise conservative method for unconstrained convex optimization," Computational Optimization and Applications, Springer, vol. 81(1), pages 251-288, January.
    10. Feng Ma, 2019. "On relaxation of some customized proximal point algorithms for convex minimization: from variational inequality perspective," Computational Optimization and Applications, Springer, vol. 73(3), pages 871-901, July.
    11. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2016. "Exact worst-case performance of first-order methods for composite convex optimization," LIDAM Discussion Papers CORE 2016052, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    12. Yanqin Bai & Xiao Han & Tong Chen & Hua Yu, 2015. "Quadratic kernel-free least squares support vector machine for target diseases classification," Journal of Combinatorial Optimization, Springer, vol. 30(4), pages 850-870, November.
    13. Liusheng Hou & Hongjin He & Junfeng Yang, 2016. "A partially parallel splitting method for multiple-block separable convex programming with applications to robust PCA," Computational Optimization and Applications, Springer, vol. 63(1), pages 273-303, January.
    14. Ernest K. Ryu & Bằng Công Vũ, 2020. "Finding the Forward-Douglas–Rachford-Forward Method," Journal of Optimization Theory and Applications, Springer, vol. 184(3), pages 858-876, March.
    15. Eike Börgens & Christian Kanzow, 2019. "Regularized Jacobi-type ADMM-methods for a class of separable convex optimization problems in Hilbert spaces," Computational Optimization and Applications, Springer, vol. 73(3), pages 755-790, July.
    16. Hongjin He & Jitamitra Desai & Kai Wang, 2016. "A primal–dual prediction–correction algorithm for saddle point optimization," Journal of Global Optimization, Springer, vol. 66(3), pages 573-583, November.
    17. Donghwan Kim & Jeffrey A. Fessler, 2018. "Adaptive Restart of the Optimized Gradient Method for Convex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 178(1), pages 240-263, July.
    18. Wei Peng & Hui Zhang & Xiaoya Zhang & Lizhi Cheng, 2020. "Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions," Journal of Global Optimization, Springer, vol. 78(1), pages 69-89, September.
    19. Donghwan Kim & Jeffrey A. Fessler, 2017. "On the Convergence Analysis of the Optimized Gradient Method," Journal of Optimization Theory and Applications, Springer, vol. 172(1), pages 187-205, January.
    20. Ying Gao & Wenxing Zhang, 2023. "An alternative extrapolation scheme of PDHGM for saddle point problem with nonlinear function," Computational Optimization and Applications, Springer, vol. 85(1), pages 263-291, May.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joptap:v:202:y:2024:i:1:d:10.1007_s10957-022-02058-3. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.