
Gradient Method with Step Adaptation

Authors

Listed:
  • Vladimir Krutikov

    (Institute of Informatics and Telecommunications, Reshetnev Siberian State University of Science and Technology, 31 Krasnoyarskii Rabochii Prospekt, 660037 Krasnoyarsk, Russia
    Department of Applied Mathematics, Kemerovo State University, 6 Krasnaya Street, 650043 Kemerovo, Russia)

  • Elena Tovbis

    (Institute of Informatics and Telecommunications, Reshetnev Siberian State University of Science and Technology, 31 Krasnoyarskii Rabochii Prospekt, 660037 Krasnoyarsk, Russia)

  • Svetlana Gutova

    (Department of Applied Mathematics, Kemerovo State University, 6 Krasnaya Street, 650043 Kemerovo, Russia)

  • Ivan Rozhnov

    (Institute of Informatics and Telecommunications, Reshetnev Siberian State University of Science and Technology, 31 Krasnoyarskii Rabochii Prospekt, 660037 Krasnoyarsk, Russia
    Laboratory “Hybrid Methods of Modeling and Optimization in Complex Systems”, Siberian Federal University, 79 Svobodny Prospekt, 660041 Krasnoyarsk, Russia)

  • Lev Kazakovtsev

    (Institute of Informatics and Telecommunications, Reshetnev Siberian State University of Science and Technology, 31 Krasnoyarskii Rabochii Prospekt, 660037 Krasnoyarsk, Russia)

Abstract

The paper addresses the problem of constructing step-adjustment algorithms for a gradient method based on the steepest descent principle. Extending, formalizing, and parameterizing the step-adjustment principle leads to gradient-type methods with incomplete relaxation or over-relaxation. Such methods require only the gradient of the function to be computed at each iteration. Optimizing the parameters of the step adaptation algorithms yields methods that significantly exceed the steepest descent method in convergence rate. In this paper, we present a universal step-adjustment algorithm that does not require selecting optimal parameters. The algorithm is based on the orthogonality of successive gradients and on replacing complete relaxation with some degree of incomplete relaxation or over-relaxation. Its convergence rate matches that of algorithms with optimized step adaptation parameters. In our experiments, the proposed algorithm outperforms the steepest descent method by a factor of 2.7 on average in the number of iterations. A further advantage of the proposed methods is that they remain operable under interference: the paper presents examples of test problems in which the interference values are uniformly distributed vectors in a ball with a radius 8 times greater than the gradient norm.
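The record reproduces only the abstract, not the algorithm itself. As a rough illustration of the idea described there, the sketch below (an assumption, not the authors' parameter-free scheme) adapts the step multiplicatively from the inner product of successive gradients: exact orthogonality is the complete-relaxation condition of steepest descent, a positive inner product signals incomplete relaxation (step too short), and a negative one signals over-relaxation (step too long). The function names, the factors q_up and q_down, and the quadratic test problem are all hypothetical; the paper's contribution is precisely an adaptation rule that avoids tuning such parameters. The noisy oracle at the end mimics the interference setup mentioned in the abstract.

```python
import numpy as np

def grad_step_adapt(grad, x0, h0=0.1, q_up=1.5, q_down=0.7,
                    max_iter=2000, tol=1e-8):
    """Gradient descent with multiplicative step adaptation (illustrative sketch).

    Steepest descent (complete relaxation) makes successive gradients
    orthogonal. Here the sign of their inner product drives the adaptation:
    positive -> the step stopped short (incomplete relaxation), lengthen it;
    negative -> the step overshot (over-relaxation), shorten it.
    """
    x = np.asarray(x0, dtype=float)
    h = h0
    g_prev = grad(x)
    for _ in range(max_iter):
        x = x - h * g_prev
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        cos = (g @ g_prev) / (np.linalg.norm(g) * np.linalg.norm(g_prev))
        if cos > 0:
            h *= q_up      # incomplete relaxation: increase the step
        elif cos < 0:
            h *= q_down    # over-relaxation: decrease the step
        g_prev = g
    return x

# Exact-gradient test on a quadratic f(x) = 0.5 * x^T A x (hypothetical setup).
A = np.diag([1.0, 4.0, 16.0])
x = grad_step_adapt(lambda v: A @ v, x0=np.ones(3))
print(np.linalg.norm(x))  # should be close to 0 (the minimizer is the origin)

# Noisy oracle mimicking the experiment described in the abstract: the
# gradient is perturbed by a vector uniformly distributed in a ball whose
# radius is 8 times the gradient norm.
rng = np.random.default_rng(0)

def noisy_grad(v):
    g = A @ v
    d = rng.standard_normal(g.size)
    d /= np.linalg.norm(d)                                # random direction
    r = 8.0 * np.linalg.norm(g) * rng.random() ** (1.0 / g.size)  # ball radius
    return g + r * d
```

Multiplicative, sign-based step control of this kind (cf. Rprop-style rules) is only a stand-in; the paper's own rule ties the degree of incomplete relaxation or over-relaxation to the gradient-orthogonality condition directly, without tunable factors.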

Suggested Citation

  • Vladimir Krutikov & Elena Tovbis & Svetlana Gutova & Ivan Rozhnov & Lev Kazakovtsev, 2024. "Gradient Method with Step Adaptation," Mathematics, MDPI, vol. 13(1), pages 1-35, December.
  • Handle: RePEc:gam:jmathe:v:13:y:2024:i:1:p:61-:d:1554807

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/13/1/61/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/13/1/61/
    Download Restriction: no


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Elena Tovbis & Vladimir Krutikov & Predrag Stanimirović & Vladimir Meshechkin & Aleksey Popov & Lev Kazakovtsev, 2023. "A Family of Multi-Step Subgradient Minimization Methods," Mathematics, MDPI, vol. 11(10), pages 1-24, May.
    2. Masaru Ito, 2016. "New results on subgradient methods for strongly convex optimization problems with a unified analysis," Computational Optimization and Applications, Springer, vol. 65(1), pages 127-172, September.
    3. Masoud Ahookhosh, 2019. "Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 89(3), pages 319-353, June.
    4. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
    5. Meruza Kubentayeva & Demyan Yarmoshik & Mikhail Persiianov & Alexey Kroshnin & Ekaterina Kotliarova & Nazarii Tupitsa & Dmitry Pasechnyuk & Alexander Gasnikov & Vladimir Shvetsov & Leonid Baryshev & A, 2024. "Primal-dual gradient methods for searching network equilibria in combined models with nested choice structure and capacity constraints," Computational Management Science, Springer, vol. 21(1), pages 1-33, June.
    6. Fedor Stonyakin & Alexander Gasnikov & Pavel Dvurechensky & Alexander Titov & Mohammad Alkousa, 2022. "Generalized Mirror Prox Algorithm for Monotone Variational Inequalities: Universality and Inexact Oracle," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 988-1013, September.
    7. Fedor Stonyakin & Ilya Kuruzov & Boris Polyak, 2023. "Stopping Rules for Gradient Methods for Non-convex Problems with Additive Noise in Gradient," Journal of Optimization Theory and Applications, Springer, vol. 198(2), pages 531-551, August.
    8. Benjamin Grimmer, 2023. "General Hölder Smooth Convergence Rates Follow from Specialized Rates Assuming Growth Bounds," Journal of Optimization Theory and Applications, Springer, vol. 197(1), pages 51-70, April.
    9. Pham Duy Khanh & Boris S. Mordukhovich & Dat Ba Tran, 2024. "Inexact Reduced Gradient Methods in Nonconvex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 203(3), pages 2138-2178, December.
    10. Bolte, Jérôme & Glaudin, Lilian & Pauwels, Edouard & Serrurier, Matthieu, 2021. "A Hölderian backtracking method for min-max and min-min problems," TSE Working Papers 21-1243, Toulouse School of Economics (TSE).
    11. Chin Pang Ho & Panos Parpas, 2019. "Empirical risk minimization: probabilistic complexity and stepsize strategy," Computational Optimization and Applications, Springer, vol. 73(2), pages 387-410, June.
    12. Anton Rodomanov & Yurii Nesterov, 2020. "Smoothness Parameter of Power of Euclidean Norm," Journal of Optimization Theory and Applications, Springer, vol. 185(2), pages 303-326, May.
13. Huynh Ngai & Ta Anh Son, 2022. "Generalized Nesterov’s accelerated proximal gradient algorithms with convergence rate of order $o(1/k^2)$," Computational Optimization and Applications, Springer, vol. 83(2), pages 615-649, November.
    14. Klimza, Anton & Gasnikov, Alexander & Stonyakin, Fedor & Alkousa, Mohammad, 2024. "Universal methods for variational inequalities: Deterministic and stochastic cases," Chaos, Solitons & Fractals, Elsevier, vol. 187(C).
    15. Alkousa, Mohammad & Stonyakin, Fedor & Gasnikov, Alexander & Abdo, Asmaa & Alcheikh, Mohammad, 2024. "Higher degree inexact model for optimization problems," Chaos, Solitons & Fractals, Elsevier, vol. 186(C).
    16. Hu, Yaohua & Li, Gongnong & Yu, Carisa Kwok Wai & Yip, Tsz Leung, 2022. "Quasi-convex feasibility problems: Subgradient methods and convergence rates," European Journal of Operational Research, Elsevier, vol. 298(1), pages 45-58.
    17. Meruza Kubentayeva & Alexander Gasnikov, 2021. "Finding Equilibria in the Traffic Assignment Problem with Primal-Dual Gradient Methods for Stable Dynamics Model and Beckmann Model," Mathematics, MDPI, vol. 9(11), pages 1-17, May.
    18. Eduard Gorbunov & Marina Danilova & Innokentiy Shibaev & Pavel Dvurechensky & Alexander Gasnikov, 2024. "High-Probability Complexity Bounds for Non-smooth Stochastic Convex Optimization with Heavy-Tailed Noise," Journal of Optimization Theory and Applications, Springer, vol. 203(3), pages 2679-2738, December.
    19. Masaru Ito & Mituhiro Fukuda, 2021. "Nearly Optimal First-Order Methods for Convex Optimization under Gradient Norm Measure: an Adaptive Regularization Approach," Journal of Optimization Theory and Applications, Springer, vol. 188(3), pages 770-804, March.
20. Masoud Ahookhosh & Arnold Neumaier, 2018. "Solving structured nonsmooth convex optimization with complexity $\mathcal{O}(\varepsilon^{-1/2})$," TOP: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 26(1), pages 110-145, April.
