
Newtonian Property of Subgradient Method with Optimization of Metric Matrix Parameter Correction

Author

Listed:
  • Elena Tovbis

    (Institute of Informatics and Telecommunications, Reshetnev Siberian State University of Science and Technology, 31, Krasnoyarskii Rabochii Prospekt, Krasnoyarsk 660037, Russia)

  • Vladimir Krutikov

    (Institute of Informatics and Telecommunications, Reshetnev Siberian State University of Science and Technology, 31, Krasnoyarskii Rabochii Prospekt, Krasnoyarsk 660037, Russia
    Department of Applied Mathematics, Kemerovo State University, 6 Krasnaya Street, Kemerovo 650043, Russia)

  • Lev Kazakovtsev

    (Institute of Informatics and Telecommunications, Reshetnev Siberian State University of Science and Technology, 31, Krasnoyarskii Rabochii Prospekt, Krasnoyarsk 660037, Russia)

Abstract

The paper proves that, when the second derivatives of the function are unstable in the minimization region, the convergence rate estimate for Newton’s method is determined by the parameters of the irreducible part of the problem’s degree of conditioning. These parameters measure how much the eigenvalues of the second-derivative matrices differ in the coordinate system where this difference is minimal, and the resulting convergence rate estimate then serves as a benchmark. The paper studies the convergence rate of a relaxation subgradient method (RSM) with optimized parameters of a two-rank correction of the metric matrices on smooth, strongly convex functions with a Lipschitz gradient, without assuming that second derivatives of the function exist. The considered RSM is similar in structure to quasi-Newton minimization methods. Unlike the latter, its metric matrix is not an approximation of the inverse matrix of second derivatives; instead, it is adjusted so that one-dimensional minimization along the resulting descent direction takes the method beyond a certain neighborhood of the current minimum. In other words, the metric matrix turns the current gradient into a direction that is consistent with the set of gradients of some neighborhood of the current minimum. Under broad assumptions on the parameters of the metric matrix transformations, an estimate of the convergence rate of the studied RSM and an estimate of its ability to eliminate a removable linear background are obtained. These estimates are qualitatively similar to those for Newton’s method, yet no assumption about the existence of second derivatives of the function is required. A computational experiment compared the quasi-Newton BFGS method with the studied subgradient method on various types of smooth functions. The results indicate that the subgradient method is effective in minimizing smooth functions with a high degree of ill-conditioning and is able to eliminate the linear background that slows convergence.
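
To make the structural idea concrete, the following is a minimal, hypothetical Python sketch of a descent method in which a metric matrix H maps the current gradient into a descent direction, followed by one-dimensional minimization along that direction and a rank-two correction of H. It does not reproduce the paper’s own RSM correction formulas (which are given only in the full text); as a stand-in it uses the standard BFGS inverse-Hessian update, the comparison method mentioned in the abstract, and a simple backtracking line search. All names and parameter values are illustrative assumptions.

    import numpy as np

    def metric_descent(f, grad, x0, iters=200, tol=1e-10):
        """Illustrative sketch only: quasi-Newton-style descent with a
        metric matrix H; NOT the paper's two-rank RSM correction."""
        x = np.asarray(x0, dtype=float)
        n = x.size
        H = np.eye(n)                       # metric matrix, initialized to identity
        g = grad(x)
        for _ in range(iters):
            if np.linalg.norm(g) < tol:
                break
            d = -H @ g                      # descent direction defined by the metric
            # one-dimensional minimization along d (crude backtracking stand-in)
            t, ft = 1.0, f(x)
            while f(x + t * d) > ft + 1e-4 * t * (g @ d) and t > 1e-16:
                t *= 0.5
            x_new = x + t * d
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g
            sy = s @ y
            if sy > 1e-12:                  # rank-two (BFGS-type) correction of H
                rho = 1.0 / sy
                V = np.eye(n) - rho * np.outer(s, y)
                H = V @ H @ V.T + rho * np.outer(s, s)
            x, g = x_new, g_new
        return x

    # Example: an ill-conditioned quadratic, where adapting the metric helps
    A = np.diag([1.0, 1e4])
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    print(metric_descent(f, grad, np.array([1.0, 1.0])))

In this sketch the metric H plays the role the abstract describes: it is not meant to approximate the inverse Hessian exactly, but to reshape the gradient so that a single line search along H·g makes substantial progress even when the problem is badly conditioned.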

Suggested Citation

  • Elena Tovbis & Vladimir Krutikov & Lev Kazakovtsev, 2024. "Newtonian Property of Subgradient Method with Optimization of Metric Matrix Parameter Correction," Mathematics, MDPI, vol. 12(11), pages 1-27, May.
  • Handle: RePEc:gam:jmathe:v:12:y:2024:i:11:p:1618-:d:1399091

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/12/11/1618/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/12/11/1618/
    Download Restriction: no

    References listed on IDEAS

    1. Anton Rodomanov & Yurii Nesterov, 2021. "New Results on Superlinear Convergence of Classical Quasi-Newton Methods," Journal of Optimization Theory and Applications, Springer, vol. 188(3), pages 744-769, March.
    2. Yurii Nesterov, 2015. "Universal gradient methods for convex optimization problems," LIDAM Reprints CORE 2701, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Elena Tovbis & Vladimir Krutikov & Predrag Stanimirović & Vladimir Meshechkin & Aleksey Popov & Lev Kazakovtsev, 2023. "A Family of Multi-Step Subgradient Minimization Methods," Mathematics, MDPI, vol. 11(10), pages 1-24, May.
    2. Masaru Ito, 2016. "New results on subgradient methods for strongly convex optimization problems with a unified analysis," Computational Optimization and Applications, Springer, vol. 65(1), pages 127-172, September.
    3. Masoud Ahookhosh, 2019. "Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 89(3), pages 319-353, June.
    4. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
    5. Meruza Kubentayeva & Demyan Yarmoshik & Mikhail Persiianov & Alexey Kroshnin & Ekaterina Kotliarova & Nazarii Tupitsa & Dmitry Pasechnyuk & Alexander Gasnikov & Vladimir Shvetsov & Leonid Baryshev & A, 2024. "Primal-dual gradient methods for searching network equilibria in combined models with nested choice structure and capacity constraints," Computational Management Science, Springer, vol. 21(1), pages 1-33, June.
    6. Fedor Stonyakin & Alexander Gasnikov & Pavel Dvurechensky & Alexander Titov & Mohammad Alkousa, 2022. "Generalized Mirror Prox Algorithm for Monotone Variational Inequalities: Universality and Inexact Oracle," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 988-1013, September.
    7. Fedor Stonyakin & Ilya Kuruzov & Boris Polyak, 2023. "Stopping Rules for Gradient Methods for Non-convex Problems with Additive Noise in Gradient," Journal of Optimization Theory and Applications, Springer, vol. 198(2), pages 531-551, August.
    8. Benjamin Grimmer, 2023. "General Hölder Smooth Convergence Rates Follow from Specialized Rates Assuming Growth Bounds," Journal of Optimization Theory and Applications, Springer, vol. 197(1), pages 51-70, April.
    9. Bolte, Jérôme & Glaudin, Lilian & Pauwels, Edouard & Serrurier, Matthieu, 2021. "A Hölderian backtracking method for min-max and min-min problems," TSE Working Papers 21-1243, Toulouse School of Economics (TSE).
    10. Chin Pang Ho & Panos Parpas, 2019. "Empirical risk minimization: probabilistic complexity and stepsize strategy," Computational Optimization and Applications, Springer, vol. 73(2), pages 387-410, June.
    11. Anton Rodomanov & Yurii Nesterov, 2020. "Smoothness Parameter of Power of Euclidean Norm," Journal of Optimization Theory and Applications, Springer, vol. 185(2), pages 303-326, May.
    12. Huynh Ngai & Ta Anh Son, 2022. "Generalized Nesterov’s accelerated proximal gradient algorithms with convergence rate of order o(1/k^2)," Computational Optimization and Applications, Springer, vol. 83(2), pages 615-649, November.
    13. Zhen-Yuan Ji & Yu-Hong Dai, 2023. "Greedy PSB methods with explicit superlinear convergence," Computational Optimization and Applications, Springer, vol. 85(3), pages 753-786, July.
    14. Hu, Yaohua & Li, Gongnong & Yu, Carisa Kwok Wai & Yip, Tsz Leung, 2022. "Quasi-convex feasibility problems: Subgradient methods and convergence rates," European Journal of Operational Research, Elsevier, vol. 298(1), pages 45-58.
    15. Meruza Kubentayeva & Alexander Gasnikov, 2021. "Finding Equilibria in the Traffic Assignment Problem with Primal-Dual Gradient Methods for Stable Dynamics Model and Beckmann Model," Mathematics, MDPI, vol. 9(11), pages 1-17, May.
    16. Vladimir Krutikov & Elena Tovbis & Predrag Stanimirović & Lev Kazakovtsev, 2023. "On the Convergence Rate of Quasi-Newton Methods on Strongly Convex Functions with Lipschitz Gradient," Mathematics, MDPI, vol. 11(23), pages 1-15, November.
    17. Masaru Ito & Mituhiro Fukuda, 2021. "Nearly Optimal First-Order Methods for Convex Optimization under Gradient Norm Measure: an Adaptive Regularization Approach," Journal of Optimization Theory and Applications, Springer, vol. 188(3), pages 770-804, March.
    18. Masoud Ahookhosh & Arnold Neumaier, 2018. "Solving structured nonsmooth convex optimization with complexity $\mathcal{O}(\varepsilon^{-1/2})$," TOP: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer; Sociedad de Estadística e Investigación Operativa, vol. 26(1), pages 110-145, April.
    19. Dvurechensky, Pavel & Gorbunov, Eduard & Gasnikov, Alexander, 2021. "An accelerated directional derivative method for smooth stochastic convex optimization," European Journal of Operational Research, Elsevier, vol. 290(2), pages 601-621.
    20. Stefania Bellavia & Gianmarco Gurioli & Benedetta Morini & Philippe Louis Toint, 2023. "The Impact of Noise on Evaluation Complexity: The Deterministic Trust-Region Case," Journal of Optimization Theory and Applications, Springer, vol. 196(2), pages 700-729, February.
