
Globally Convergent Three-Term Conjugate Gradient Methods that Use Secant Conditions and Generate Descent Search Directions for Unconstrained Optimization

Authors
  • Kaori Sugiki

    (Mizuho Information & Research Institute, Inc.)

  • Yasushi Narushima

    (Fukushima National College of Technology)

  • Hiroshi Yabe

    (Tokyo University of Science)

Abstract

In this paper, we propose a three-term conjugate gradient method based on secant conditions for unconstrained optimization problems. Specifically, we apply the idea of Dai and Liao (in Appl. Math. Optim. 43:87–101, 2001) to the three-term conjugate gradient method proposed by Narushima et al. (in SIAM J. Optim. 21:212–230, 2011). Moreover, we derive a special-purpose three-term conjugate gradient method for problems whose objective function has a special structure, and apply it to nonlinear least squares problems. We prove global convergence properties of the proposed methods. Finally, some numerical results are given to show the performance of our methods.
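To fix ideas, the following is a minimal sketch (our gloss, not the paper's exact formulation) of the two ingredients named in the abstract: a three-term direction of the kind studied by Narushima et al. (2011), which is a descent direction for any choice of the parameter \beta_k, and a Dai–Liao-type \beta_k motivated by a secant condition. The auxiliary vector p_k below is any vector with g_k^T p_k \neq 0, and t \ge 0 is a fixed parameter; the paper's specific choices may differ.

    % LaTeX sketch; beta_k, p_k, and the secant vector shown here are
    % illustrative choices, not necessarily those of the paper.
    \[
      d_0 = -g_0, \qquad
      d_k = -g_k + \beta_k d_{k-1} + \gamma_k p_k, \qquad
      \gamma_k = -\beta_k \, \frac{g_k^{\top} d_{k-1}}{g_k^{\top} p_k},
    \]
    % so that g_k^T d_k = -||g_k||^2 for every beta_k (the descent property).
    \[
      \beta_k^{\mathrm{DL}} = \frac{g_k^{\top} (y_{k-1} - t\, s_{k-1})}{d_{k-1}^{\top} y_{k-1}}, \qquad
      s_{k-1} = x_k - x_{k-1}, \quad y_{k-1} = g_k - g_{k-1}, \quad t \ge 0,
    \]
    % where the standard secant condition B_k s_{k-1} = y_{k-1} motivates the numerator.

Substituting d_k into g_k^T d_k shows that the \beta_k d_{k-1} and \gamma_k p_k terms cancel, leaving g_k^T d_k = -\|g_k\|^2 regardless of \beta_k; this is what lets such methods generate descent search directions independently of the line search.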

Suggested Citation

  • Kaori Sugiki & Yasushi Narushima & Hiroshi Yabe, 2012. "Globally Convergent Three-Term Conjugate Gradient Methods that Use Secant Conditions and Generate Descent Search Directions for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 153(3), pages 733-757, June.
  • Handle: RePEc:spr:joptap:v:153:y:2012:i:3:d:10.1007_s10957-011-9960-x
    DOI: 10.1007/s10957-011-9960-x
    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-011-9960-x
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-011-9960-x?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. J. Z. Zhang & N. Y. Deng & L. H. Chen, 1999. "New Quasi-Newton Equation and Related Methods for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 102(1), pages 147-167, July.
    2. Avinoam Perry, 1978. "Technical Note—A Modified Conjugate Gradient Algorithm," Operations Research, INFORMS, vol. 26(6), pages 1073-1078, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. XiaoLiang Dong & Deren Han & Zhifeng Dai & Lixiang Li & Jianguang Zhu, 2018. "An Accelerated Three-Term Conjugate Gradient Method with Sufficient Descent Condition and Conjugacy Condition," Journal of Optimization Theory and Applications, Springer, vol. 179(3), pages 944-961, December.
    2. Qi Tian & Xiaoliang Wang & Liping Pang & Mingkun Zhang & Fanyun Meng, 2021. "A New Hybrid Three-Term Conjugate Gradient Algorithm for Large-Scale Unconstrained Problems," Mathematics, MDPI, vol. 9(12), pages 1-13, June.
    3. Bakhtawar Baluch & Zabidin Salleh & Ahmad Alhawarat & U. A. M. Roslan, 2017. "A New Modified Three-Term Conjugate Gradient Method with Sufficient Descent Property and Its Global Convergence," Journal of Mathematics, Hindawi, vol. 2017, pages 1-12, September.
    4. Yasushi Narushima & Shummin Nakayama & Masashi Takemura & Hiroshi Yabe, 2023. "Memoryless Quasi-Newton Methods Based on the Spectral-Scaling Broyden Family for Riemannian Optimization," Journal of Optimization Theory and Applications, Springer, vol. 197(2), pages 639-664, May.
    5. Auwal Bala Abubakar & Poom Kumam & Aliyu Muhammed Awwal & Phatiphat Thounthong, 2019. "A Modified Self-Adaptive Conjugate Gradient Method for Solving Convex Constrained Monotone Nonlinear Equations for Signal Recovery Problems," Mathematics, MDPI, vol. 7(8), pages 1-24, August.
    6. Mehiddin Al-Baali & Yasushi Narushima & Hiroshi Yabe, 2015. "A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization," Computational Optimization and Applications, Springer, vol. 60(1), pages 89-110, January.
    7. Dong, Xiao Liang & Liu, Hong Wei & He, Yu Bo, 2015. "New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction," Applied Mathematics and Computation, Elsevier, vol. 269(C), pages 606-617.
    8. Mina Torabi & Mohammad-Mehdi Hosseini, 2018. "A New Descent Algorithm Using the Three-Step Discretization Method for Solving Unconstrained Optimization Problems," Mathematics, MDPI, vol. 6(4), pages 1-18, April.
    9. Babaie-Kafaki, Saman & Ghanbari, Reza, 2014. "The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices," European Journal of Operational Research, Elsevier, vol. 234(3), pages 625-630.
    10. Saman Babaie-Kafaki & Reza Ghanbari, 2016. "Descent Symmetrization of the Dai–Liao Conjugate Gradient Method," Asia-Pacific Journal of Operational Research (APJOR), World Scientific Publishing Co. Pte. Ltd., vol. 33(02), pages 1-10, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waziri, Mohammed Yusuf & Ahmed, Kabiru & Sabi’u, Jamilu, 2019. "A family of Hager–Zhang conjugate gradient methods for system of monotone nonlinear equations," Applied Mathematics and Computation, Elsevier, vol. 361(C), pages 645-660.
    2. Fahimeh Biglari & Farideh Mahmoodpur, 2016. "Scaling Damped Limited-Memory Updates for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 170(1), pages 177-188, July.
    3. Babaie-Kafaki, Saman & Ghanbari, Reza, 2014. "The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices," European Journal of Operational Research, Elsevier, vol. 234(3), pages 625-630.
    4. Bassim A. Hassan & Issam A. R. Moghrabi & Thaair A. Ameen & Ranen M. Sulaiman & Ibrahim Mohammed Sulaiman, 2024. "Image Noise Reduction and Solution of Unconstrained Minimization Problems via New Conjugate Gradient Methods," Mathematics, MDPI, vol. 12(17), pages 1-12, September.
    5. Nataj, Sarah & Lui, S.H., 2020. "Superlinear convergence of nonlinear conjugate gradient method and scaled memoryless BFGS method based on assumptions about the initial point," Applied Mathematics and Computation, Elsevier, vol. 369(C).
    6. Yao, Shengwei & Lu, Xiwen & Ning, Liangshuo & Li, Feifei, 2015. "A class of one parameter conjugate gradient methods," Applied Mathematics and Computation, Elsevier, vol. 265(C), pages 708-722.
    7. Yu, Yang & Wang, Yu & Deng, Rui & Yin, Yu, 2023. "New DY-HS hybrid conjugate gradient algorithm for solving optimization problem of unsteady partial differential equations with convection term," Mathematics and Computers in Simulation (MATCOM), Elsevier, vol. 208(C), pages 677-701.
    8. Hassan Mohammad & Mohammed Yusuf Waziri, 2019. "Structured Two-Point Stepsize Gradient Methods for Nonlinear Least Squares," Journal of Optimization Theory and Applications, Springer, vol. 181(1), pages 298-317, April.
    9. Fahimeh Biglari & Maghsud Solimanpur, 2013. "Scaling on the Spectral Gradient Method," Journal of Optimization Theory and Applications, Springer, vol. 158(2), pages 626-635, August.
    10. S. Bojari & M. R. Eslahchi, 2020. "Global convergence of a family of modified BFGS methods under a modified weak-Wolfe–Powell line search for nonconvex functions," 4OR, Springer, vol. 18(2), pages 219-244, June.
    11. Yong Li & Gonglin Yuan & Zhou Sheng, 2018. "An active-set algorithm for solving large-scale nonsmooth optimization models with box constraints," PLOS ONE, Public Library of Science, vol. 13(1), pages 1-16, January.
    12. Neculai Andrei, 2013. "Another Conjugate Gradient Algorithm with Guaranteed Descent and Conjugacy Conditions for Large-scale Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 159(1), pages 159-182, October.
    13. Gonglin Yuan & Xiaoliang Wang & Zhou Sheng, 2020. "The Projection Technique for Two Open Problems of Unconstrained Optimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 186(2), pages 590-619, August.
    14. Qing-Rui He & Chun-Rong Chen & Sheng-Jie Li, 2023. "Spectral conjugate gradient methods for vector optimization problems," Computational Optimization and Applications, Springer, vol. 86(2), pages 457-489, November.
    15. Jinbao Jian & Lin Yang & Xianzhen Jiang & Pengjie Liu & Meixing Liu, 2020. "A Spectral Conjugate Gradient Method with Descent Property," Mathematics, MDPI, vol. 8(2), pages 1-13, February.
    16. Mehiddin Al-Baali & Humaid Khalfan, 2012. "A combined class of self-scaling and modified quasi-Newton methods," Computational Optimization and Applications, Springer, vol. 52(2), pages 393-408, June.
    17. Parvaneh Faramarzi & Keyvan Amini, 2019. "A Modified Spectral Conjugate Gradient Method with Global Convergence," Journal of Optimization Theory and Applications, Springer, vol. 182(2), pages 667-690, August.
    18. C. X. Kou & Y. H. Dai, 2015. "A Modified Self-Scaling Memoryless Broyden–Fletcher–Goldfarb–Shanno Method for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 165(1), pages 209-224, April.
    19. D. Tarzanagh & M. Peyghami, 2015. "A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems," Journal of Global Optimization, Springer, vol. 63(4), pages 709-728, December.
    20. Andrei, Neculai, 2010. "Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization," European Journal of Operational Research, Elsevier, vol. 204(3), pages 410-420, August.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.