
An improved Dai–Kou conjugate gradient algorithm for unconstrained optimization

Author

Listed:
  • Zexian Liu

    (Xidian University
    Chinese Academy of Sciences)

  • Hongwei Liu

    (Xidian University)

  • Yu-Hong Dai

    (Chinese Academy of Sciences)

Abstract

It is increasingly accepted that the loss of orthogonality of the gradients in a conjugate gradient algorithm may decelerate the convergence rate to some extent. The Dai–Kou conjugate gradient algorithm (SIAM J Optim 23(1):296–320, 2013), called CGOPT, has attracted many researchers' attention due to its numerical efficiency. In this paper, we present an improved Dai–Kou conjugate gradient algorithm for unconstrained optimization, which consists of only two kinds of iterations. In the improved Dai–Kou conjugate gradient algorithm, we develop a new quasi-Newton method to improve the orthogonality by solving a subproblem in a subspace, and we design a modified strategy for choosing the initial stepsize to improve the numerical performance. The global convergence of the improved Dai–Kou conjugate gradient algorithm is established without the strict assumptions used in the convergence analyses of other limited memory conjugate gradient methods. Some numerical results suggest that the improved Dai–Kou conjugate gradient algorithm (CGOPT (2.0)) yields a tremendous improvement over the original Dai–Kou CG algorithm (CGOPT (1.0)) and is slightly superior to the latest limited memory conjugate gradient software package CG_DESCENT (6.8) developed by Hager and Zhang (SIAM J Optim 23(4):2150–2168, 2013) on the CUTEr library.
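
For orientation only, the sketch below shows a generic nonlinear conjugate gradient loop in Python with a Hestenes–Stiefel-type beta and a Barzilai–Borwein-style initial stepsize guess. It is not the authors' CGOPT (2.0) implementation, which additionally solves a quasi-Newton subproblem in a subspace and uses a more sophisticated line search; the function name cg_minimize and all parameter values are illustrative assumptions.

```python
# Minimal sketch of a nonlinear conjugate gradient loop (NOT the authors'
# CGOPT (2.0) code): a Hestenes-Stiefel-type beta with a descent safeguard,
# a Barzilai-Borwein-style initial stepsize guess, and a backtracking Armijo
# line search stand in for the paper's quasi-Newton subspace step and
# improved line search.
import numpy as np

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                       # first direction: steepest descent
    alpha = 1.0 / max(np.linalg.norm(g), 1.0)    # crude first initial stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        # Backtracking Armijo line search (illustrative stand-in only).
        fx, gTd = f(x), g @ d
        while f(x + alpha * d) > fx + 1e-4 * alpha * gTd:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Hestenes-Stiefel-type beta; fall back to steepest descent if the
        # curvature term d^T y is too small.
        dy = d @ y
        beta = (g_new @ y) / dy if abs(dy) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new @ d >= 0.0:                     # safeguard: keep a descent direction
            d = -g_new
        # Barzilai-Borwein-style guess for the next initial stepsize.
        sy = s @ y
        alpha = (s @ s) / sy if sy > 1e-12 else 1.0
        x, g = x_new, g_new
    return x

# Example usage on a simple quadratic (illustrative only).
if __name__ == "__main__":
    A = np.diag([1.0, 10.0, 100.0])
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    print(cg_minimize(f, grad, np.ones(3)))
```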

Suggested Citation

  • Zexian Liu & Hongwei Liu & Yu-Hong Dai, 2020. "An improved Dai–Kou conjugate gradient algorithm for unconstrained optimization," Computational Optimization and Applications, Springer, vol. 75(1), pages 145-167, January.
  • Handle: RePEc:spr:coopap:v:75:y:2020:i:1:d:10.1007_s10589-019-00143-4
    DOI: 10.1007/s10589-019-00143-4

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10589-019-00143-4
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10589-019-00143-4?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Avinoam Perry, 1977. "A Class of Conjugate Gradient Algorithms with a Two-Step Variable Metric Memory," Discussion Papers 269, Northwestern University, Center for Mathematical Studies in Economics and Management Science.
    2. D. Tarzanagh & M. Peyghami, 2015. "A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems," Journal of Global Optimization, Springer, vol. 63(4), pages 709-728, December.
    3. Zexian Liu & Hongwei Liu, 2019. "An Efficient Gradient Method with Approximately Optimal Stepsize Based on Tensor Model for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 181(2), pages 608-633, May.

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Abubakar, Auwal Bala & Kumam, Poom & Malik, Maulana & Ibrahim, Abdulkarim Hassan, 2022. "A hybrid conjugate gradient based approach for solving unconstrained optimization and motion control problems," Mathematics and Computers in Simulation (MATCOM), Elsevier, vol. 201(C), pages 640-657.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. M. Fatemi, 2016. "An Optimal Parameter for Dai–Liao Family of Conjugate Gradient Methods," Journal of Optimization Theory and Applications, Springer, vol. 169(2), pages 587-605, May.
    2. S. Bojari & M. R. Eslahchi, 2020. "Global convergence of a family of modified BFGS methods under a modified weak-Wolfe–Powell line search for nonconvex functions," 4OR, Springer, vol. 18(2), pages 219-244, June.
    3. C. X. Kou & Y. H. Dai, 2015. "A Modified Self-Scaling Memoryless Broyden–Fletcher–Goldfarb–Shanno Method for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 165(1), pages 209-224, April.
    4. Andrei, Neculai, 2010. "Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization," European Journal of Operational Research, Elsevier, vol. 204(3), pages 410-420, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:coopap:v:75:y:2020:i:1:d:10.1007_s10589-019-00143-4. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.