
High-Probability Complexity Bounds for Non-smooth Stochastic Convex Optimization with Heavy-Tailed Noise

Authors

Listed:
  • Eduard Gorbunov

    (Mohamed bin Zayed University of Artificial Intelligence)

  • Marina Danilova

    (Institute of Control Sciences RAS; Moscow Institute of Physics and Technology)

  • Innokentiy Shibaev

    (Moscow Institute of Physics and Technology; National Research University Higher School of Economics)

  • Pavel Dvurechensky

    (Weierstrass Institute for Applied Analysis and Stochastics)

  • Alexander Gasnikov

    (Moscow Institute of Physics and Technology; Innopolis University; Institute for Information Transmission Problems RAS)

Abstract

Stochastic first-order methods are standard for training large-scale machine learning models. Because of their random behavior, a particular run of an algorithm may yield a highly suboptimal objective value, whereas theoretical guarantees are usually proved for the expected objective value. It is therefore essential to guarantee theoretically that algorithms provide small objective residuals with high probability. Existing methods for non-smooth stochastic convex optimization have complexity bounds whose dependence on the confidence level is either negative-power, or logarithmic but only under the additional assumption of a sub-Gaussian (light-tailed) noise distribution, which may not hold in practice. In our paper, we resolve this issue and derive the first high-probability convergence results with logarithmic dependence on the confidence level for non-smooth convex stochastic optimization problems with non-sub-Gaussian (heavy-tailed) noise. To derive our results, we propose novel stepsize rules for two stochastic methods with gradient clipping. Moreover, our analysis works for generalized smooth objectives with Hölder-continuous gradients, and for both methods we provide an extension for strongly convex problems. Finally, our results imply that the first (accelerated) method we consider also has optimal iteration and oracle complexity in all the regimes, and the second one is optimal in the non-smooth setting.

Suggested Citation

  • Eduard Gorbunov & Marina Danilova & Innokentiy Shibaev & Pavel Dvurechensky & Alexander Gasnikov, 2024. "High-Probability Complexity Bounds for Non-smooth Stochastic Convex Optimization with Heavy-Tailed Noise," Journal of Optimization Theory and Applications, Springer, vol. 203(3), pages 2679-2738, December.
  • Handle: RePEc:spr:joptap:v:203:y:2024:i:3:d:10.1007_s10957-024-02533-z
    DOI: 10.1007/s10957-024-02533-z

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-024-02533-z
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-024-02533-z?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a location where you can use your library subscription to access this item.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Olivier Devolder & François Glineur & Yurii Nesterov, 2011. "First-order methods of smooth convex optimization with inexact oracle," LIDAM Discussion Papers CORE 2011002, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    2. Yurii Nesterov, 2015. "Universal gradient methods for convex optimization problems," LIDAM Reprints CORE 2701, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    3. Pavel Dvurechensky & Alexander Gasnikov, 2016. "Stochastic Intermediate Gradient Method for Convex Problems with Stochastic Inexact Oracle," Journal of Optimization Theory and Applications, Springer, vol. 171(1), pages 121-145, October.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one (a sketch of this co-citation idea follows the list).
    1. Fedor Stonyakin & Alexander Gasnikov & Pavel Dvurechensky & Alexander Titov & Mohammad Alkousa, 2022. "Generalized Mirror Prox Algorithm for Monotone Variational Inequalities: Universality and Inexact Oracle," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 988-1013, September.
    2. Alkousa, Mohammad & Stonyakin, Fedor & Gasnikov, Alexander & Abdo, Asmaa & Alcheikh, Mohammad, 2024. "Higher degree inexact model for optimization problems," Chaos, Solitons & Fractals, Elsevier, vol. 186(C).
    3. Masaru Ito, 2016. "New results on subgradient methods for strongly convex optimization problems with a unified analysis," Computational Optimization and Applications, Springer, vol. 65(1), pages 127-172, September.
    4. Fedor Stonyakin & Ilya Kuruzov & Boris Polyak, 2023. "Stopping Rules for Gradient Methods for Non-convex Problems with Additive Noise in Gradient," Journal of Optimization Theory and Applications, Springer, vol. 198(2), pages 531-551, August.
    5. Masoud Ahookhosh & Arnold Neumaier, 2018. "Solving structured nonsmooth convex optimization with complexity $\mathcal{O}(\varepsilon^{-1/2})$," TOP: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 26(1), pages 110-145, April.
    6. Masoud Ahookhosh, 2019. "Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 89(3), pages 319-353, June.
    7. Tianxiao Sun & Ion Necoara & Quoc Tran-Dinh, 2020. "Composite convex optimization with global and local inexact oracles," Computational Optimization and Applications, Springer, vol. 76(1), pages 69-124, May.
    8. Dvurechensky, Pavel & Gorbunov, Eduard & Gasnikov, Alexander, 2021. "An accelerated directional derivative method for smooth stochastic convex optimization," European Journal of Operational Research, Elsevier, vol. 290(2), pages 601-621.
    9. Vladimir Krutikov & Svetlana Gutova & Elena Tovbis & Lev Kazakovtsev & Eugene Semenkin, 2022. "Relaxation Subgradient Algorithms with Machine Learning Procedures," Mathematics, MDPI, vol. 10(21), pages 1-33, October.
    10. Elena Tovbis & Vladimir Krutikov & Predrag Stanimirović & Vladimir Meshechkin & Aleksey Popov & Lev Kazakovtsev, 2023. "A Family of Multi-Step Subgradient Minimization Methods," Mathematics, MDPI, vol. 11(10), pages 1-24, May.
    11. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2016. "Exact worst-case performance of first-order methods for composite convex optimization," LIDAM Discussion Papers CORE 2016052, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    12. Xuexue Zhang & Sanyang Liu & Nannan Zhao, 2023. "An Extended Gradient Method for Smooth and Strongly Convex Functions," Mathematics, MDPI, vol. 11(23), pages 1-14, November.
    13. Olivier Devolder & François Glineur & Yurii Nesterov, 2013. "First-order methods with inexact oracle: the strongly convex case," LIDAM Discussion Papers CORE 2013016, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    14. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
    15. Le Thi Khanh Hien & Cuong V. Nguyen & Huan Xu & Canyi Lu & Jiashi Feng, 2019. "Accelerated Randomized Mirror Descent Algorithms for Composite Non-strongly Convex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 181(2), pages 541-566, May.
    16. Olivier Devolder, 2011. "Stochastic first order methods in smooth convex optimization," LIDAM Discussion Papers CORE 2011070, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    17. Ya-Feng Liu & Xin Liu & Shiqian Ma, 2019. "On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming," Mathematics of Operations Research, INFORMS, vol. 44(2), pages 632-650, May.
    18. J. O. Royset & E. Y. Pee, 2012. "Rate of Convergence Analysis of Discretization and Smoothing Algorithms for Semiinfinite Minimax Problems," Journal of Optimization Theory and Applications, Springer, vol. 155(3), pages 855-882, December.
    19. Bolte, Jérôme & Glaudin, Lilian & Pauwels, Edouard & Serrurier, Matthieu, 2021. "A Hölderian backtracking method for min-max and min-min problems," TSE Working Papers 21-1243, Toulouse School of Economics (TSE).
    20. Huynh Ngai & Ta Anh Son, 2022. "Generalized Nesterov’s accelerated proximal gradient algorithms with convergence rate of order $o(1/k^2)$," Computational Optimization and Applications, Springer, vol. 83(2), pages 615-649, November.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joptap:v:203:y:2024:i:3:d:10.1007_s10957-024-02533-z. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.