
On R-linear convergence analysis for a class of gradient methods

Author

  • Na Huang (China Agricultural University)

Abstract

The gradient method is a simple optimization approach that uses the negative gradient of the objective function as its search direction. Its efficiency relies heavily on the choice of stepsize. In this paper, the convergence behavior is analyzed for a class of gradient methods whose stepsizes possess an important property introduced by Dai (Optimization 52:395–415, 2003). Our analysis focuses on the minimization of strictly convex quadratic functions. We establish R-linear convergence and derive an estimate for the R-factor. Specifically, if the stepsizes can be expressed as Rayleigh quotients of the inverse Hessian matrix, we show that these methods converge R-linearly and that their R-factors are bounded above by $$1-\frac{1}{\varkappa}$$, where $$\varkappa$$ is the associated condition number. Preliminary numerical results demonstrate the tightness of our estimate of the R-factor.
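To make the setting concrete, the sketch below (illustrative only, not the paper's own code or experiments) runs a gradient method $$x_{k+1} = x_k - \alpha_k g_k$$ on a strictly convex quadratic $$f(x)=\frac{1}{2}x^{T}Ax-b^{T}x$$, using the second Barzilai–Borwein stepsize. Since $$y_{k-1}=As_{k-1}$$ for quadratics, this stepsize equals the Rayleigh quotient $$y_{k-1}^{T}A^{-1}y_{k-1}/y_{k-1}^{T}y_{k-1}$$ of the inverse Hessian, so it is one member of the class the abstract describes. Recall that a sequence converges R-linearly with R-factor $$\sigma\in(0,1)$$ if $$\Vert x_k-x^{*}\Vert \le C\sigma^{k}$$ for some constant $$C>0$$. The function name and toy problem are assumptions made for illustration.

```python
import numpy as np

# Minimal sketch (assumed names, not the paper's code): gradient method
# x_{k+1} = x_k - alpha_k * g_k on f(x) = 0.5 x^T A x - b^T x, with the
# BB2 stepsize alpha_k = (s^T y)/(y^T y). For quadratics y = A s, so
# alpha_k = (y^T A^{-1} y)/(y^T y), a Rayleigh quotient of the inverse
# Hessian -- one member of the class analyzed in the paper.

def gradient_method_bb2(A, b, x0, tol=1e-10, max_iter=1000):
    x = x0.copy()
    g = A @ x - b                       # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(A, 2)  # first step: 1/lambda_max(A), which
                                        # also lies in the Rayleigh-quotient range
    history = [np.linalg.norm(g)]
    for _ in range(max_iter):
        if history[-1] < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g     # y = A s since the Hessian is constant
        alpha = (s @ y) / (y @ y)       # BB2 stepsize
        x, g = x_new, g_new
        history.append(np.linalg.norm(g))
    return x, history

# Toy problem: diagonal Hessian with condition number kappa = 100.
A = np.diag(np.linspace(1.0, 100.0, 50))
b = np.random.default_rng(0).standard_normal(50)
x_star, hist = gradient_method_bb2(A, b, np.zeros(50))
print(f"iterations: {len(hist) - 1}, final gradient norm: {hist[-1]:.2e}")
```

On this toy problem the condition number is $$\varkappa = 100$$, so the paper's estimate would bound the R-factor by $$1 - 1/\varkappa = 0.99$$; checking that the gradient-norm history decays below that geometric envelope is a quick sanity check of the estimate.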

Suggested Citation

  • Na Huang, 2022. "On R-linear convergence analysis for a class of gradient methods," Computational Optimization and Applications, Springer, vol. 81(1), pages 161-177, January.
  • Handle: RePEc:spr:coopap:v:81:y:2022:i:1:d:10.1007_s10589-021-00333-z
    DOI: 10.1007/s10589-021-00333-z

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10589-021-00333-z
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10589-021-00333-z?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Roberta De Asmundis & Daniela di Serafino & William Hager & Gerardo Toraldo & Hongchao Zhang, 2014. "An efficient gradient method using the Yuan steplength," Computational Optimization and Applications, Springer, vol. 59(3), pages 541-563, December.
    2. Yu-Hong Dai & Yakui Huang & Xin-Wei Liu, 2019. "A family of spectral gradient methods for optimization," Computational Optimization and Applications, Springer, vol. 74(1), pages 43-65, September.
    3. Wenyu Sun & Ya-Xiang Yuan, 2006. "Optimization Theory and Methods," Springer Optimization and Its Applications, Springer, number 978-0-387-24976-6, June.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Roberto Andreani & Marcos Raydan, 2021. "Properties of the delayed weighted gradient method," Computational Optimization and Applications, Springer, vol. 78(1), pages 167-180, January.
    2. Yakui Huang & Yu-Hong Dai & Xin-Wei Liu & Hongchao Zhang, 2022. "On the acceleration of the Barzilai–Borwein method," Computational Optimization and Applications, Springer, vol. 81(3), pages 717-740, April.
    3. Masoud Fatemi, 2022. "On initial point selection of the steepest descent algorithm for general quadratic functions," Computational Optimization and Applications, Springer, vol. 82(2), pages 329-360, June.
    4. Yasushi Narushima & Shummin Nakayama & Masashi Takemura & Hiroshi Yabe, 2023. "Memoryless Quasi-Newton Methods Based on the Spectral-Scaling Broyden Family for Riemannian Optimization," Journal of Optimization Theory and Applications, Springer, vol. 197(2), pages 639-664, May.
    5. Saha, Tanay & Rakshit, Suman & Khare, Swanand R., 2023. "Linearly structured quadratic model updating using partial incomplete eigendata," Applied Mathematics and Computation, Elsevier, vol. 446(C).
    6. Guang Li & Paat Rusmevichientong & Huseyin Topaloglu, 2015. "The d-Level Nested Logit Model: Assortment and Price Optimization Problems," Operations Research, INFORMS, vol. 63(2), pages 325-342, April.
    7. Zheng, Sanpeng & Feng, Renzhong, 2023. "A variable projection method for the general radial basis function neural network," Applied Mathematics and Computation, Elsevier, vol. 451(C).
    8. Jörg Fliege & Andrey Tin & Alain Zemkoho, 2021. "Gauss–Newton-type methods for bilevel optimization," Computational Optimization and Applications, Springer, vol. 78(3), pages 793-824, April.
    9. Hai-Jun Wang & Qin Ni, 2010. "A Convex Approximation Method For Large Scale Linear Inequality Constrained Minimization," Asia-Pacific Journal of Operational Research (APJOR), World Scientific Publishing Co. Pte. Ltd., vol. 27(01), pages 85-101.
    10. Chen, Liang, 2016. "A high-order modified Levenberg–Marquardt method for systems of nonlinear equations with fourth-order convergence," Applied Mathematics and Computation, Elsevier, vol. 285(C), pages 79-93.
    11. Ji, Li-Qun, 2015. "An assessment of agricultural residue resources for liquid biofuel production in China," Renewable and Sustainable Energy Reviews, Elsevier, vol. 44(C), pages 561-575.
    12. Babaie-Kafaki, Saman & Ghanbari, Reza, 2014. "The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices," European Journal of Operational Research, Elsevier, vol. 234(3), pages 625-630.
    13. Marko Miladinović & Predrag Stanimirović & Sladjana Miljković, 2011. "Scalar Correction Method for Solving Large Scale Unconstrained Minimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 151(2), pages 304-320, November.
    14. Wei Bian & Xiaojun Chen, 2017. "Optimality and Complexity for Constrained Optimization Problems with Nonconvex Regularization," Mathematics of Operations Research, INFORMS, vol. 42(4), pages 1063-1084, November.
    15. Yutao Zheng & Bing Zheng, 2017. "Two New Dai–Liao-Type Conjugate Gradient Methods for Unconstrained Optimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 175(2), pages 502-509, November.
    16. Xiaojing Zhu & Hiroyuki Sato, 2020. "Riemannian conjugate gradient methods with inverse retraction," Computational Optimization and Applications, Springer, vol. 77(3), pages 779-810, December.
    17. Bonettini, Silvia & Prato, Marco & Rebegoldi, Simone, 2016. "A cyclic block coordinate descent method with generalized gradient projections," Applied Mathematics and Computation, Elsevier, vol. 286(C), pages 288-300.
    18. Li, Jinqing & Ma, Jun, 2019. "Maximum penalized likelihood estimation of additive hazards models with partly interval censoring," Computational Statistics & Data Analysis, Elsevier, vol. 137(C), pages 170-180.
    19. Serena Crisci & Federica Porta & Valeria Ruggiero & Luca Zanni, 2023. "Hybrid limited memory gradient projection methods for box-constrained optimization problems," Computational Optimization and Applications, Springer, vol. 84(1), pages 151-189, January.
    20. Zohre Aminifard & Saman Babaie-Kafaki, 2019. "An optimal parameter choice for the Dai–Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix," 4OR, Springer, vol. 17(3), pages 317-330, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:coopap:v:81:y:2022:i:1:d:10.1007_s10589-021-00333-z. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.