IDEAS home Printed from https://ideas.repec.org/a/gam/jmathe/v8y2020i2p280-d322557.html

A Spectral Conjugate Gradient Method with Descent Property

Author

Listed:
  • Jinbao Jian

    (College of Science, Guangxi University for Nationalities, Nanning 530006, Guangxi, China)

  • Lin Yang

    (College of Science, Guangxi University for Nationalities, Nanning 530006, Guangxi, China)

  • Xianzhen Jiang

    (College of Science, Guangxi University for Nationalities, Nanning 530006, Guangxi, China)

  • Pengjie Liu

    (College of Mathematics and Information Science, Guangxi University, Nanning 530004, Guangxi, China)

  • Meixing Liu

    (Guangxi Colleges and Universities Key Laboratory of Complex System Optimization and Big Data Processing, Yulin Normal University, Yulin 537000, Guangxi, China)

Abstract

The spectral conjugate gradient method (SCGM) is an important generalization of the conjugate gradient method (CGM) and one of the effective numerical methods for large-scale unconstrained optimization. Designing the spectral parameter and the conjugate parameter is the core task in an SCGM, and the aim of this paper is to propose a new and effective way of choosing these two parameters. First, motivated by the strong Wolfe line search requirement, we design a new spectral parameter. Second, we propose a hybrid conjugate parameter. Choosing the two parameters in this way ensures that the search directions always possess the descent property, independently of any line search rule. As a result, a new SCGM with the standard Wolfe line search is proposed. Under the usual assumptions, the global convergence of the proposed SCGM is proved. Finally, the presented SCGM is tested on 108 instances, ranging from 2 to 1,000,000 dimensions, from the CUTE library and other classic test collections, and compared with existing SCGMs and CGMs. The detailed results and their corresponding performance profiles are reported, showing that the proposed SCGM is effective and promising.
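To illustrate the general scheme the abstract describes, here is a minimal Python sketch of a spectral conjugate gradient iteration with a Wolfe line search. The spectral parameter (here fixed at 1) and the conjugate parameter (here a Fletcher–Reeves formula) are illustrative placeholders, not the specific parameters proposed in the paper; the descent safeguard likewise stands in for the paper's built-in descent guarantee.

```python
import numpy as np

def wolfe_step(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_iter=50):
    """Bisection/doubling search for a step satisfying the standard Wolfe
    conditions (sufficient decrease plus curvature); falls back to the
    last trial step if no acceptable step is found."""
    f0, g0d = f(x), grad(x) @ d
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + c1 * alpha * g0d:    # sufficient decrease fails
            hi = alpha
        elif grad(x + alpha * d) @ d < c2 * g0d:        # curvature condition fails
            lo = alpha
        else:
            return alpha
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * alpha
    return alpha

def scg_minimize(f, grad, x0, tol=1e-8, max_iter=500):
    """Generic spectral CG: d_{k+1} = -theta_k * g_{k+1} + beta_k * d_k."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_step(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        theta = 1.0                                     # placeholder spectral parameter
        beta = (g_new @ g_new) / (g @ g)                # Fletcher-Reeves conjugate parameter
        d = -theta * g_new + beta * d
        if g_new @ d >= 0:                              # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = scg_minimize(f, grad, np.zeros(2))
```

The paper's contribution lies precisely in the parts stubbed out above: formulas for theta and beta that make the directions descent directions automatically, so no restart safeguard is needed.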

Suggested Citation

  • Jinbao Jian & Lin Yang & Xianzhen Jiang & Pengjie Liu & Meixing Liu, 2020. "A Spectral Conjugate Gradient Method with Descent Property," Mathematics, MDPI, vol. 8(2), pages 1-13, February.
  • Handle: RePEc:gam:jmathe:v:8:y:2020:i:2:p:280-:d:322557

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/8/2/280/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/8/2/280/
    Download Restriction: no

    References listed on IDEAS

    1. C. X. Kou & Y. H. Dai, 2015. "A Modified Self-Scaling Memoryless Broyden–Fletcher–Goldfarb–Shanno Method for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 165(1), pages 209-224, April.
    2. Avinoam Perry, 1978. "Technical Note—A Modified Conjugate Gradient Algorithm," Operations Research, INFORMS, vol. 26(6), pages 1073-1078, December.
    3. N. Andrei, 2009. "Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 141(2), pages 249-264, May.
    4. Y.H. Dai & Y. Yuan, 2001. "An Efficient Hybrid Conjugate Gradient Method for Unconstrained Optimization," Annals of Operations Research, Springer, vol. 103(1), pages 33-47, March.

    Citations

Citations are extracted by the CitEc Project.


    Cited by:

    1. Asier Zulueta & Decebal Aitor Ispas-Gil & Ekaitz Zulueta & Joseba Garcia-Ortega & Unai Fernandez-Gamiz, 2022. "Battery Sizing Optimization in Power Smoothing Applications," Energies, MDPI, vol. 15(3), pages 1-20, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Neculai Andrei, 2013. "Another Conjugate Gradient Algorithm with Guaranteed Descent and Conjugacy Conditions for Large-scale Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 159(1), pages 159-182, October.
    2. Saman Babaie-Kafaki, 2012. "A Quadratic Hybridization of Polak–Ribière–Polyak and Fletcher–Reeves Conjugate Gradient Methods," Journal of Optimization Theory and Applications, Springer, vol. 154(3), pages 916-932, September.
    3. Parvaneh Faramarzi & Keyvan Amini, 2021. "A spectral three-term Hestenes–Stiefel conjugate gradient method," 4OR, Springer, vol. 19(1), pages 71-92, March.
    4. Elena Tovbis & Vladimir Krutikov & Predrag Stanimirović & Vladimir Meshechkin & Aleksey Popov & Lev Kazakovtsev, 2023. "A Family of Multi-Step Subgradient Minimization Methods," Mathematics, MDPI, vol. 11(10), pages 1-24, May.
    5. Kaori Sugiki & Yasushi Narushima & Hiroshi Yabe, 2012. "Globally Convergent Three-Term Conjugate Gradient Methods that Use Secant Conditions and Generate Descent Search Directions for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 153(3), pages 733-757, June.
    6. Hiroyuki Sakai & Hideaki Iiduka, 2020. "Hybrid Riemannian conjugate gradient methods with global convergence properties," Computational Optimization and Applications, Springer, vol. 77(3), pages 811-830, December.
    7. Bassim A. Hassan & Issam A. R. Moghrabi & Thaair A. Ameen & Ranen M. Sulaiman & Ibrahim Mohammed Sulaiman, 2024. "Image Noise Reduction and Solution of Unconstrained Minimization Problems via New Conjugate Gradient Methods," Mathematics, MDPI, vol. 12(17), pages 1-12, September.
    8. Nataj, Sarah & Lui, S.H., 2020. "Superlinear convergence of nonlinear conjugate gradient method and scaled memoryless BFGS method based on assumptions about the initial point," Applied Mathematics and Computation, Elsevier, vol. 369(C).
    9. Yao, Shengwei & Lu, Xiwen & Ning, Liangshuo & Li, Feifei, 2015. "A class of one parameter conjugate gradient methods," Applied Mathematics and Computation, Elsevier, vol. 265(C), pages 708-722.
    10. Serge Gratton & Vincent Malmedy & Philippe Toint, 2012. "Using approximate secant equations in limited memory methods for multilevel unconstrained optimization," Computational Optimization and Applications, Springer, vol. 51(3), pages 967-979, April.
    11. Kin Keung Lai & Shashi Kant Mishra & Bhagwat Ram & Ravina Sharma, 2023. "A Conjugate Gradient Method: Quantum Spectral Polak–Ribière–Polyak Approach for Unconstrained Optimization Problems," Mathematics, MDPI, vol. 11(23), pages 1-14, December.
    12. Priester, C. Robert & Melbourne-Thomas, Jessica & Klocker, Andreas & Corney, Stuart, 2017. "Abrupt transitions in dynamics of a NPZD model across Southern Ocean fronts," Ecological Modelling, Elsevier, vol. 359(C), pages 372-382.
    13. Yu, Yang & Wang, Yu & Deng, Rui & Yin, Yu, 2023. "New DY-HS hybrid conjugate gradient algorithm for solving optimization problem of unsteady partial differential equations with convection term," Mathematics and Computers in Simulation (MATCOM), Elsevier, vol. 208(C), pages 677-701.
    14. Ahmad M. Alshamrani & Adel Fahad Alrasheedi & Khalid Abdulaziz Alnowibet & Salem Mahdi & Ali Wagdy Mohamed, 2022. "A Hybrid Stochastic Deterministic Algorithm for Solving Unconstrained Optimization Problems," Mathematics, MDPI, vol. 10(17), pages 1-26, August.
    15. S. Bojari & M. R. Eslahchi, 2020. "Global convergence of a family of modified BFGS methods under a modified weak-Wolfe–Powell line search for nonconvex functions," 4OR, Springer, vol. 18(2), pages 219-244, June.
    16. B. Sellami & Y. Chaib, 2016. "A new family of globally convergent conjugate gradient methods," Annals of Operations Research, Springer, vol. 241(1), pages 497-513, June.
    17. Zhifeng Dai, 2017. "Comments on Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 175(1), pages 286-291, October.
    18. N. Andrei, 2009. "Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 141(2), pages 249-264, May.
    19. Hiroyuki Sakai & Hideaki Iiduka, 2021. "Sufficient Descent Riemannian Conjugate Gradient Methods," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 130-150, July.
    20. Jose Giovany Babativa-Márquez & José Luis Vicente-Villardón, 2021. "Logistic Biplot by Conjugate Gradient Algorithms and Iterated SVD," Mathematics, MDPI, vol. 9(16), pages 1-19, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:8:y:2020:i:2:p:280-:d:322557. See general information about how to correct material in RePEc.


    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.