Printed from https://ideas.repec.org/p/hal/journl/hal-03857182.html

Asymptotic study of stochastic adaptive algorithm in non-convex landscape

Author

Listed:
  • Sébastien Gadat

    (TSE-R - Toulouse School of Economics - UT Capitole - Université Toulouse Capitole - UT - Université de Toulouse - EHESS - École des hautes études en sciences sociales - CNRS - Centre National de la Recherche Scientifique - INRAE - Institut National de Recherche pour l’Agriculture, l’Alimentation et l’Environnement)

  • Ioana Gavra

    (IRMAR - Institut de Recherche Mathématique de Rennes - UR - Université de Rennes - INSA Rennes - Institut National des Sciences Appliquées - Rennes - INSA - Institut National des Sciences Appliquées - ENS Rennes - École normale supérieure - Rennes - UR2 - Université de Rennes 2 - CNRS - Centre National de la Recherche Scientifique - Institut Agro Rennes Angers - Institut Agro - Institut national d'enseignement supérieur pour l'agriculture, l'alimentation et l'environnement)

Abstract

This paper studies asymptotic properties of adaptive algorithms widely used in optimization and machine learning, among them Adagrad and RMSProp, which are involved in most black-box deep learning pipelines. We adopt a non-convex landscape optimization point of view with a one-time-scale parametrization, and we cover the situations where these algorithms are run with or without mini-batches. Viewing them as stochastic algorithms, we establish their almost sure convergence, when run with a decreasing step size, towards the set of critical points of the target function. Under a mild additional assumption on the noise, we also obtain convergence towards the set of minimizers of the function. Along the way, we also obtain a "convergence rate" for the methods, in the vein of the works of [GL13].
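The adaptive schemes discussed in the abstract can be illustrated with a short sketch. The following is a minimal, hypothetical Python illustration, not the paper's exact parametrization or assumptions: one-time-scale Adagrad- and RMSProp-style updates driven by noisy (mini-batch) gradient estimates, with a decreasing step size gamma_n = gamma0 / sqrt(n+1). The toy objective, the noise scale, and all constants (gamma0, beta, eps) are illustrative choices, not taken from the paper.

```python
import numpy as np

def adagrad_step(theta, v, g, n, gamma0=0.1, eps=1e-8):
    """One Adagrad-style update: accumulate squared gradients, scale a decreasing step."""
    v = v + g ** 2                       # running sum of coordinate-wise squared gradients
    gamma_n = gamma0 / np.sqrt(n + 1)    # decreasing step size (illustrative choice)
    theta = theta - gamma_n * g / (np.sqrt(v) + eps)
    return theta, v

def rmsprop_step(theta, v, g, n, gamma0=0.1, beta=0.9, eps=1e-8):
    """One RMSProp-style update: exponential moving average of squared gradients."""
    v = beta * v + (1.0 - beta) * g ** 2
    gamma_n = gamma0 / np.sqrt(n + 1)
    theta = theta - gamma_n * g / (np.sqrt(v) + eps)
    return theta, v

# Toy usage on a non-convex objective f(x) = x^4 - 3 x^2 with Gaussian gradient noise.
rng = np.random.default_rng(0)
theta, v = np.array([2.0]), np.zeros(1)
for n in range(5000):
    grad = 4 * theta ** 3 - 6 * theta          # exact gradient of the toy objective
    g = grad + rng.normal(scale=1.0, size=1)   # noisy mini-batch gradient estimate
    theta, v = adagrad_step(theta, v, g, n)
```

On this toy objective, the iterates are expected to settle near one of the critical points (0 or ±sqrt(3/2)), which is the flavor of almost-sure convergence to critical points that the paper establishes under its own assumptions.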

Suggested Citation

  • Sébastien Gadat & Ioana Gavra, 2022. "Asymptotic study of stochastic adaptive algorithm in non-convex landscape," Post-Print hal-03857182, HAL.
  • Handle: RePEc:hal:journl:hal-03857182
    Note: View the original document on HAL open archive server: https://hal.science/hal-03857182

    Download full text from publisher

    File URL: https://hal.science/hal-03857182/document
    Download Restriction: no

    References listed on IDEAS

    1. Heinz H. Bauschke & Jérôme Bolte & Marc Teboulle, 2017. "A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications," Mathematics of Operations Research, INFORMS, vol. 42(2), pages 330-348, May.
    2. Costa, Manon & Gadat, Sébastien & Bercu, Bernard, 2020. "Stochastic approximation algorithms for superquantiles estimation," TSE Working Papers 20-1142, Toulouse School of Economics (TSE).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Gadat, Sébastien & Gavra, Ioana, 2021. "Asymptotic study of stochastic adaptive algorithm in non-convex landscape," TSE Working Papers 21-1175, Toulouse School of Economics (TSE).
    2. Shota Takahashi & Mituhiro Fukuda & Mirai Tanaka, 2022. "New Bregman proximal type algorithms for solving DC optimization problems," Computational Optimization and Applications, Springer, vol. 83(3), pages 893-931, December.
    3. HyungSeon Oh, 2021. "Distributed optimal power flow," PLOS ONE, Public Library of Science, vol. 16(6), pages 1-27, June.
    4. Zehui Jia & Jieru Huang & Xingju Cai, 2021. "Proximal-like incremental aggregated gradient method with Bregman distance in weakly convex optimization problems," Journal of Global Optimization, Springer, vol. 80(4), pages 841-864, August.
    5. Fan Wu & Wei Bian, 2023. "Smoothing Accelerated Proximal Gradient Method with Fast Convergence Rate for Nonsmooth Convex Optimization Beyond Differentiability," Journal of Optimization Theory and Applications, Springer, vol. 197(2), pages 539-572, May.
    6. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
    7. Emanuel Laude & Peter Ochs & Daniel Cremers, 2020. "Bregman Proximal Mappings and Bregman–Moreau Envelopes Under Relative Prox-Regularity," Journal of Optimization Theory and Applications, Springer, vol. 184(3), pages 724-761, March.
    8. Yin Liu & Sam Davanloo Tajbakhsh, 2023. "Stochastic Composition Optimization of Functions Without Lipschitz Continuous Gradient," Journal of Optimization Theory and Applications, Springer, vol. 198(1), pages 239-289, July.
    9. Radu-Alexandru Dragomir & Alexandre d’Aspremont & Jérôme Bolte, 2021. "Quartic First-Order Methods for Low-Rank Minimization," Journal of Optimization Theory and Applications, Springer, vol. 189(2), pages 341-363, May.
    10. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization," Computational Optimization and Applications, Springer, vol. 79(3), pages 681-715, July.
    11. Xiantao Xiao, 2021. "A Unified Convergence Analysis of Stochastic Bregman Proximal Gradient and Extragradient Methods," Journal of Optimization Theory and Applications, Springer, vol. 188(3), pages 605-627, March.
    12. Filip Hanzely & Peter Richtárik, 2021. "Fastest rates for stochastic mirror descent methods," Computational Optimization and Applications, Springer, vol. 79(3), pages 717-766, July.
    13. Abbaszadehpeivasti, Hadi, 2024. "Performance analysis of optimization methods for machine learning," Other publications TiSEM 3050a62d-1a1f-494e-99ef-7, Tilburg University, School of Economics and Management.
    14. Yunier Bello-Cruz & Guoyin Li & Tran Thai An Nghia, 2022. "Quadratic Growth Conditions and Uniqueness of Optimal Solution to Lasso," Journal of Optimization Theory and Applications, Springer, vol. 194(1), pages 167-190, July.
    15. Yunier Bello-Cruz & Guoyin Li & Tran T. A. Nghia, 2021. "On the Linear Convergence of Forward–Backward Splitting Method: Part I—Convergence Analysis," Journal of Optimization Theory and Applications, Springer, vol. 188(2), pages 378-401, February.
    16. Xin Jiang & Lieven Vandenberghe, 2023. "Bregman Three-Operator Splitting Methods," Journal of Optimization Theory and Applications, Springer, vol. 196(3), pages 936-972, March.
    17. Regina S. Burachik & Yaohua Hu & Xiaoqi Yang, 2022. "Interior quasi-subgradient method with non-Euclidean distances for constrained quasi-convex optimization problems in Hilbert spaces," Journal of Global Optimization, Springer, vol. 83(2), pages 249-271, June.
    18. Zamani, Moslem & Abbaszadehpeivasti, Hadi & de Klerk, Etienne, 2024. "The exact worst-case convergence rate of the alternating direction method of multipliers," Other publications TiSEM f30ae9e6-ed19-423f-bd1e-0, Tilburg University, School of Economics and Management.
    19. Masoud Ahookhosh, 2019. "Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 89(3), pages 319-353, June.
    20. S. Bonettini & M. Prato & S. Rebegoldi, 2018. "A block coordinate variable metric linesearch based proximal gradient method," Computational Optimization and Applications, Springer, vol. 71(1), pages 5-52, September.

    More about this item

    Keywords

    Stochastic optimization; Stochastic adaptive algorithm; Convergence of random variables;

