
Universal methods for variational inequalities: Deterministic and stochastic cases

Author

Listed:
  • Klimza, Anton
  • Gasnikov, Alexander
  • Stonyakin, Fedor
  • Alkousa, Mohammad

Abstract

In this paper, we propose universal proximal mirror methods for solving variational inequality problems with Hölder-continuous operators in both deterministic and stochastic settings. The proposed methods automatically adapt not only to the oracle’s noise (in the stochastic setting) but also to the Hölder continuity of the operator, without prior knowledge of either the problem class or the nature of the operator information. We analyze the proposed algorithms in both settings and obtain estimates of the number of iterations required to reach a given solution quality for the variational inequality. We show that, without knowing the Hölder exponent or the Hölder constant of the operator, the proposed algorithms achieve the best possible worst-case complexity for the considered class of variational inequalities. We also compare the resulting stochastic algorithm with other popular optimizers on an image classification task.
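
To make the adaptivity described in the abstract concrete, below is a minimal sketch in the spirit of adaptive/universal extragradient (mirror-prox) schemes with a Euclidean prox. It is an illustration under stated assumptions, not the authors' exact method: the function names (adaptive_extragradient, simplex_proj), the particular acceptance test with slack eps/2, the halving/doubling rule for the smoothness estimate L, and the toy matrix-game example are all choices made for this sketch.

```python
import numpy as np


def adaptive_extragradient(F, project, x0, eps=1e-3, L0=1.0, max_iter=500):
    """Sketch of an adaptive (parameter-free) extragradient method for the
    monotone variational inequality: find x* in Q with <F(x*), x - x*> >= 0
    for all x in Q.  Euclidean prox only; no Hölder parameters are required.

    F       : callable, operator F(x) -> ndarray (monotone, Hölder-continuous)
    project : callable, Euclidean projection onto the feasible set Q
    x0      : starting point in Q
    eps     : target accuracy; enters the acceptance test as slack eps/2
    L0      : initial guess for the local smoothness estimate
    """
    x = np.asarray(x0, dtype=float)
    L = L0
    y_sum = np.zeros_like(x)
    w_sum = 0.0

    for _ in range(max_iter):
        L = max(L / 2.0, 1e-12)            # optimistic halving at each iteration
        Fx = F(x)
        while True:
            gamma = 1.0 / L
            y = project(x - gamma * Fx)    # extrapolation (half) step
            Fy = F(y)
            z = project(x - gamma * Fy)    # main step, operator evaluated at y
            # Backtracking acceptance test (universal-method style): the Hölder
            # continuity of F is absorbed into an inexact-smoothness inequality
            # with slack eps/2, so no Hölder exponent or constant is ever used.
            lhs = gamma * np.dot(Fy - Fx, y - z)
            rhs = (0.5 * np.linalg.norm(y - x) ** 2
                   + 0.5 * np.linalg.norm(z - y) ** 2
                   + gamma * eps / 2.0)
            if lhs <= rhs:
                break
            L *= 2.0                       # test failed: larger L, smaller step
        y_sum += gamma * y                 # step-size-weighted averaging of y_k
        w_sum += gamma
        x = z

    return y_sum / w_sum                   # approximate solution (ergodic average)


if __name__ == "__main__":
    # Toy usage: a bilinear matrix game on a pair of probability simplices,
    # written as a VI with the monotone (skew-symmetric) operator
    # F(p, q) = (A q, -A^T p).
    rng = np.random.default_rng(0)
    n = 5
    A = rng.standard_normal((n, n))

    def simplex_proj(v):
        # Euclidean projection onto the probability simplex
        u = np.sort(v)[::-1]
        css = np.cumsum(u) - 1.0
        rho = np.nonzero(u - css / (np.arange(len(v)) + 1) > 0)[0][-1]
        return np.maximum(v - css[rho] / (rho + 1.0), 0.0)

    def F(x):
        p, q = x[:n], x[n:]
        return np.concatenate([A @ q, -A.T @ p])

    def project(x):
        return np.concatenate([simplex_proj(x[:n]), simplex_proj(x[n:])])

    x0 = np.full(2 * n, 1.0 / n)
    sol = adaptive_extragradient(F, project, x0, eps=1e-3, max_iter=300)
    print("approximate equilibrium strategies:", sol[:n], sol[n:])
```

The adaptivity emphasized in the abstract is visible in the backtracking test: whenever the inequality fails, the local estimate L is doubled, and the slack eps/2 absorbs the error that a Hölder-continuous (rather than Lipschitz-continuous) operator introduces, so neither the Hölder exponent nor the constant is ever supplied to the method.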

Suggested Citation

  • Klimza, Anton & Gasnikov, Alexander & Stonyakin, Fedor & Alkousa, Mohammad, 2024. "Universal methods for variational inequalities: Deterministic and stochastic cases," Chaos, Solitons & Fractals, Elsevier, vol. 187(C).
  • Handle: RePEc:eee:chsofr:v:187:y:2024:i:c:s0960077924009706
    DOI: 10.1016/j.chaos.2024.115418

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0960077924009706
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.chaos.2024.115418?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Nesterov, Yurii, 2015. "Universal gradient methods for convex optimization problems," LIDAM Reprints CORE 2701, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    2. Fedor Stonyakin & Alexander Gasnikov & Pavel Dvurechensky & Alexander Titov & Mohammad Alkousa, 2022. "Generalized Mirror Prox Algorithm for Monotone Variational Inequalities: Universality and Inexact Oracle," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 988-1013, September.
    3. Rieger, Janosch & Tam, Matthew K., 2020. "Backward-Forward-Reflected-Backward Splitting for Three Operator Monotone Inclusions," Applied Mathematics and Computation, Elsevier, vol. 381(C).
    4. Cong Dang & Guanghui Lan, 2015. "On the convergence properties of non-Euclidean extragradient methods for variational inequalities with generalized monotone operators," Computational Optimization and Applications, Springer, vol. 60(2), pages 277-310, March.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Fedor Stonyakin & Alexander Gasnikov & Pavel Dvurechensky & Alexander Titov & Mohammad Alkousa, 2022. "Generalized Mirror Prox Algorithm for Monotone Variational Inequalities: Universality and Inexact Oracle," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 988-1013, September.
    2. Elena Tovbis & Vladimir Krutikov & Predrag Stanimirović & Vladimir Meshechkin & Aleksey Popov & Lev Kazakovtsev, 2023. "A Family of Multi-Step Subgradient Minimization Methods," Mathematics, MDPI, vol. 11(10), pages 1-24, May.
    3. Masaru Ito, 2016. "New results on subgradient methods for strongly convex optimization problems with a unified analysis," Computational Optimization and Applications, Springer, vol. 65(1), pages 127-172, September.
    4. Xiao-Juan Zhang & Xue-Wu Du & Zhen-Ping Yang & Gui-Hua Lin, 2019. "An Infeasible Stochastic Approximation and Projection Algorithm for Stochastic Variational Inequalities," Journal of Optimization Theory and Applications, Springer, vol. 183(3), pages 1053-1076, December.
    5. Boţ, R.I. & Csetnek, E.R. & Vuong, P.T., 2020. "The forward–backward–forward method from continuous and discrete perspective for pseudo-monotone variational inequalities in Hilbert spaces," European Journal of Operational Research, Elsevier, vol. 287(1), pages 49-60.
    6. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
    7. Francisco J. Aragón Artacho & Rubén Campoy & Matthew K. Tam, 2021. "Strengthened splitting methods for computing resolvents," Computational Optimization and Applications, Springer, vol. 80(2), pages 549-585, November.
    8. Fedor Stonyakin & Ilya Kuruzov & Boris Polyak, 2023. "Stopping Rules for Gradient Methods for Non-convex Problems with Additive Noise in Gradient," Journal of Optimization Theory and Applications, Springer, vol. 198(2), pages 531-551, August.
    9. Luis M. Briceño-Arias & Fernando Roldán, 2022. "Four-Operator Splitting via a Forward–Backward–Half-Forward Algorithm with Line Search," Journal of Optimization Theory and Applications, Springer, vol. 195(1), pages 205-225, October.
    10. Bolte, Jérôme & Glaudin, Lilian & Pauwels, Edouard & Serrurier, Matthieu, 2021. "A Hölderian backtracking method for min-max and min-min problems," TSE Working Papers 21-1243, Toulouse School of Economics (TSE).
    11. Huynh Ngai & Ta Anh Son, 2022. "Generalized Nesterov’s accelerated proximal gradient algorithms with convergence rate of order o(1/k²)," Computational Optimization and Applications, Springer, vol. 83(2), pages 615-649, November.
    12. Alkousa, Mohammad & Stonyakin, Fedor & Gasnikov, Alexander & Abdo, Asmaa & Alcheikh, Mohammad, 2024. "Higher degree inexact model for optimization problems," Chaos, Solitons & Fractals, Elsevier, vol. 186(C).
    13. Hu, Yaohua & Li, Gongnong & Yu, Carisa Kwok Wai & Yip, Tsz Leung, 2022. "Quasi-convex feasibility problems: Subgradient methods and convergence rates," European Journal of Operational Research, Elsevier, vol. 298(1), pages 45-58.
    14. Vladimir Krutikov & Elena Tovbis & Svetlana Gutova & Ivan Rozhnov & Lev Kazakovtsev, 2024. "Gradient Method with Step Adaptation," Mathematics, MDPI, vol. 13(1), pages 1-35, December.
    15. Aswin Kannan & Uday V. Shanbhag, 2019. "Optimal stochastic extragradient schemes for pseudomonotone stochastic variational inequality problems and their variants," Computational Optimization and Applications, Springer, vol. 74(3), pages 779-820, December.
    16. Masaru Ito & Mituhiro Fukuda, 2021. "Nearly Optimal First-Order Methods for Convex Optimization under Gradient Norm Measure: an Adaptive Regularization Approach," Journal of Optimization Theory and Applications, Springer, vol. 188(3), pages 770-804, March.
    17. Masoud Ahookhosh & Arnold Neumaier, 2018. "Solving structured nonsmooth convex optimization with complexity $\mathcal{O}(\varepsilon^{-1/2})$," TOP: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 26(1), pages 110-145, April.
    18. Elena Tovbis & Vladimir Krutikov & Lev Kazakovtsev, 2024. "Newtonian Property of Subgradient Method with Optimization of Metric Matrix Parameter Correction," Mathematics, MDPI, vol. 12(11), pages 1-27, May.
    19. Rubén Campoy, 2022. "A product space reformulation with reduced dimension for splitting algorithms," Computational Optimization and Applications, Springer, vol. 83(1), pages 319-348, September.
    20. Duong Viet Thong & Aviv Gibali & Mathias Staudigl & Phan Tu Vuong, 2021. "Computing Dynamic User Equilibrium on Large-Scale Networks Without Knowing Global Parameters," Networks and Spatial Economics, Springer, vol. 21(3), pages 735-768, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:chsofr:v:187:y:2024:i:c:s0960077924009706. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Thayer, Thomas R. (email available below). General contact details of provider: https://www.journals.elsevier.com/chaos-solitons-and-fractals .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.