Printed from https://ideas.repec.org/a/eee/chsofr/v187y2024ics0960077924009706.html

Universal methods for variational inequalities: Deterministic and stochastic cases

Author

Listed:
  • Klimza, Anton
  • Gasnikov, Alexander
  • Stonyakin, Fedor
  • Alkousa, Mohammad

Abstract

In this paper, we propose universal proximal mirror methods for solving variational inequality problems with Hölder-continuous operators in both deterministic and stochastic settings. The proposed methods automatically adapt not only to the oracle’s noise (in the stochastic setting) but also to the Hölder continuity of the operator, without prior knowledge of either the problem class or the nature of the operator information. We analyze the proposed algorithms in both deterministic and stochastic settings and derive estimates for the number of iterations required to reach a given solution quality for the variational inequality. We show that, without knowing the Hölder exponent and Hölder constant of the operator, the proposed algorithms achieve the lowest possible worst-case complexity for the considered class of variational inequalities. We also compare the resulting stochastic algorithm with other popular optimizers on an image classification task.
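The idea of a method that adapts to unknown smoothness can be illustrated with a small sketch. The code below is not the authors' algorithm: it is a generic projected extragradient iteration for a monotone variational inequality on a box, where the stepsize is found by backtracking (a Lipschitz-style acceptance test) rather than from a known Hölder constant. The acceptance constant 0.5 and the toy skew-symmetric operator are illustrative choices.

```python
import numpy as np

def project_box(x, lo, hi):
    # Euclidean projection onto the box [lo, hi]^n
    return np.clip(x, lo, hi)

def adaptive_extragradient(F, x0, lo, hi, iters=500, lam=1.0, tol=1e-8):
    """Solve  find x*: <F(x*), x - x*> >= 0 for all x in [lo, hi]^n
    with a backtracked ("universal"-style) stepsize: no Lipschitz or
    Hölder constant is supplied; lam is tested and halved on the fly."""
    x = x0.astype(float).copy()
    for _ in range(iters):
        lam *= 2.0  # optimistically enlarge the step, then backtrack
        while True:
            y = project_box(x - lam * F(x), lo, hi)  # extrapolation step
            z = project_box(x - lam * F(y), lo, hi)  # correction step
            # accept lam if F behaves Lipschitz-like between x and y
            if lam * np.linalg.norm(F(y) - F(x)) <= 0.5 * np.linalg.norm(y - x) + tol:
                break
            lam /= 2.0
        if np.linalg.norm(z - x) < tol:
            return z
        x = z
    return x

# Toy monotone (skew-symmetric) operator: saddle dynamics of x1*x2 on [-1,1]^2,
# whose unique solution is the origin.
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
F = lambda v: A @ v
sol = adaptive_extragradient(F, np.array([0.9, -0.7]), -1.0, 1.0)
```

Because the operator is never assumed Lipschitz with a known constant, the doubling-then-halving loop plays the role that the unknown Hölder parameters would otherwise play in fixing the stepsize.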

Suggested Citation

  • Klimza, Anton & Gasnikov, Alexander & Stonyakin, Fedor & Alkousa, Mohammad, 2024. "Universal methods for variational inequalities: Deterministic and stochastic cases," Chaos, Solitons & Fractals, Elsevier, vol. 187(C).
  • Handle: RePEc:eee:chsofr:v:187:y:2024:i:c:s0960077924009706
    DOI: 10.1016/j.chaos.2024.115418

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0960077924009706
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.chaos.2024.115418?utm_source=ideas
LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Fedor Stonyakin & Alexander Gasnikov & Pavel Dvurechensky & Alexander Titov & Mohammad Alkousa, 2022. "Generalized Mirror Prox Algorithm for Monotone Variational Inequalities: Universality and Inexact Oracle," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 988-1013, September.
    2. Duong Viet Thong & Aviv Gibali & Mathias Staudigl & Phan Tu Vuong, 2021. "Computing Dynamic User Equilibrium on Large-Scale Networks Without Knowing Global Parameters," Networks and Spatial Economics, Springer, vol. 21(3), pages 735-768, September.
    3. Elena Tovbis & Vladimir Krutikov & Predrag Stanimirović & Vladimir Meshechkin & Aleksey Popov & Lev Kazakovtsev, 2023. "A Family of Multi-Step Subgradient Minimization Methods," Mathematics, MDPI, vol. 11(10), pages 1-24, May.
    4. Masaru Ito, 2016. "New results on subgradient methods for strongly convex optimization problems with a unified analysis," Computational Optimization and Applications, Springer, vol. 65(1), pages 127-172, September.
    5. Masoud Ahookhosh, 2019. "Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 89(3), pages 319-353, June.
    6. Zhen-Ping Yang & Gui-Hua Lin, 2021. "Variance-Based Single-Call Proximal Extragradient Algorithms for Stochastic Mixed Variational Inequalities," Journal of Optimization Theory and Applications, Springer, vol. 190(2), pages 393-427, August.
    7. Xiao-Juan Zhang & Xue-Wu Du & Zhen-Ping Yang & Gui-Hua Lin, 2019. "An Infeasible Stochastic Approximation and Projection Algorithm for Stochastic Variational Inequalities," Journal of Optimization Theory and Applications, Springer, vol. 183(3), pages 1053-1076, December.
    8. Boţ, R.I. & Csetnek, E.R. & Vuong, P.T., 2020. "The forward–backward–forward method from continuous and discrete perspective for pseudo-monotone variational inequalities in Hilbert spaces," European Journal of Operational Research, Elsevier, vol. 287(1), pages 49-60.
    9. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
    10. Meruza Kubentayeva & Demyan Yarmoshik & Mikhail Persiianov & Alexey Kroshnin & Ekaterina Kotliarova & Nazarii Tupitsa & Dmitry Pasechnyuk & Alexander Gasnikov & Vladimir Shvetsov & Leonid Baryshev & A, 2024. "Primal-dual gradient methods for searching network equilibria in combined models with nested choice structure and capacity constraints," Computational Management Science, Springer, vol. 21(1), pages 1-33, June.
    11. Francisco J. Aragón Artacho & Rubén Campoy & Matthew K. Tam, 2021. "Strengthened splitting methods for computing resolvents," Computational Optimization and Applications, Springer, vol. 80(2), pages 549-585, November.
    12. Fedor Stonyakin & Ilya Kuruzov & Boris Polyak, 2023. "Stopping Rules for Gradient Methods for Non-convex Problems with Additive Noise in Gradient," Journal of Optimization Theory and Applications, Springer, vol. 198(2), pages 531-551, August.
    13. Benjamin Grimmer, 2023. "General Hölder Smooth Convergence Rates Follow from Specialized Rates Assuming Growth Bounds," Journal of Optimization Theory and Applications, Springer, vol. 197(1), pages 51-70, April.
    14. Luis M. Briceño-Arias & Fernando Roldán, 2022. "Four-Operator Splitting via a Forward–Backward–Half-Forward Algorithm with Line Search," Journal of Optimization Theory and Applications, Springer, vol. 195(1), pages 205-225, October.
    15. Pham Duy Khanh & Boris S. Mordukhovich & Dat Ba Tran, 2024. "Inexact Reduced Gradient Methods in Nonconvex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 203(3), pages 2138-2178, December.
    16. Ahmet Alacaoglu & Yura Malitsky & Volkan Cevher, 2021. "Forward-reflected-backward method with variance reduction," Computational Optimization and Applications, Springer, vol. 80(2), pages 321-346, November.
    17. Bolte, Jérôme & Glaudin, Lilian & Pauwels, Edouard & Serrurier, Matthieu, 2021. "A Hölderian backtracking method for min-max and min-min problems," TSE Working Papers 21-1243, Toulouse School of Economics (TSE).
    18. Chin Pang Ho & Panos Parpas, 2019. "Empirical risk minimization: probabilistic complexity and stepsize strategy," Computational Optimization and Applications, Springer, vol. 73(2), pages 387-410, June.
    19. Anton Rodomanov & Yurii Nesterov, 2020. "Smoothness Parameter of Power of Euclidean Norm," Journal of Optimization Theory and Applications, Springer, vol. 185(2), pages 303-326, May.
20. Huynh Ngai & Ta Anh Son, 2022. "Generalized Nesterov’s accelerated proximal gradient algorithms with convergence rate of order o(1/k²)," Computational Optimization and Applications, Springer, vol. 83(2), pages 615-649, November.

