
A Double-Parameter Scaling Broyden–Fletcher–Goldfarb–Shanno Method Based on Minimizing the Measure Function of Byrd and Nocedal for Unconstrained Optimization

Author

Listed:
  • Neculai Andrei

    (Center for Advanced Modeling and Optimization; Academy of Romanian Scientists)

Abstract

In this paper, the first two terms on the right-hand side of the Broyden–Fletcher–Goldfarb–Shanno update are scaled by one positive parameter, while the third term is scaled by a second positive parameter. These scaling parameters are determined by minimizing the measure function introduced by Byrd and Nocedal (SIAM J Numer Anal 26:727–739, 1989). The resulting algorithm is close to the algorithm based on clustering the eigenvalues of the Broyden–Fletcher–Goldfarb–Shanno approximation of the Hessian and shifting its large eigenvalues to the left, but it is not superior to it. Under classical assumptions, convergence is proved by using the trace and the determinant of the iteration matrix. On a set of 80 unconstrained optimization test problems, the algorithm minimizing the measure function of Byrd and Nocedal is shown to be more efficient and more robust than some other scaling Broyden–Fletcher–Goldfarb–Shanno algorithms, including the variants of Biggs (J Inst Math Appl 12:337–338, 1973), Yuan (IMA J Numer Anal 11:325–332, 1991), Oren and Luenberger (Manag Sci 20:845–862, 1974) and Nocedal and Yuan (Math Program 61:19–37, 1993). However, it is less efficient than the algorithms based on clustering the eigenvalues of the iteration matrix and shifting its large eigenvalues to the left, as shown by Andrei (J Comput Appl Math 332:26–44, 2018; Numer Algorithms 77:413–432, 2018).
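
For readers who want to see the idea of the abstract in concrete form, the following Python sketch, under stated assumptions, scales the first two terms of the BFGS right-hand side B_{k+1} = B_k - (B_k s_k s_k^T B_k)/(s_k^T B_k s_k) + (y_k y_k^T)/(y_k^T s_k) by a parameter delta_k and the third term by a parameter gamma_k, and picks the two parameters by minimizing the Byrd–Nocedal measure function psi(B) = trace(B) - ln(det(B)). The names delta_k and gamma_k and the closed-form minimizers used in the code are illustrative assumptions; they are not taken from the article and may differ from its exact formulas and safeguards.

    # A minimal numerical sketch (not the article's code): double-parameter scaled
    # BFGS update with the two scaling factors chosen to minimize the measure
    # function psi(B) = trace(B) - ln(det(B)) of Byrd and Nocedal.
    import numpy as np

    def byrd_nocedal_measure(B):
        """psi(B) = trace(B) - ln(det(B)); finite for symmetric positive definite B."""
        _, logdet = np.linalg.slogdet(B)
        return np.trace(B) - logdet

    def double_parameter_scaled_bfgs(B, s, y):
        """One update B_new = delta*(B - B s s^T B / (s^T B s)) + gamma * y y^T / (y^T s).

        delta and gamma minimize psi(B_new); the closed forms below are derived
        for illustration only and need not match the article's exact formulas or
        safeguards. Assumes B symmetric positive definite, n >= 2, and the
        curvature condition y^T s > 0.
        """
        n = B.shape[0]
        Bs = B @ s
        sBs = s @ Bs                      # s^T B s > 0 for SPD B
        ys = y @ s                        # y^T s > 0 (e.g. guaranteed by a Wolfe line search)
        A = B - np.outer(Bs, Bs) / sBs    # first two terms of the BFGS right-hand side
        # trace(B_new) = delta*trace(A) + gamma*||y||^2/ys and
        # det(B_new)   = delta**(n-1) * gamma * det(B) * ys/sBs,
        # so psi is separable in delta and gamma; setting its partial derivatives
        # to zero gives:
        delta = (n - 1) / np.trace(A)
        gamma = ys / (y @ y)
        return delta * A + gamma * np.outer(y, y) / ys

With delta = gamma = 1 the function reduces to the classical BFGS update, so comparing byrd_nocedal_measure on the two resulting matrices gives a quick check of the effect of the scaling.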

Suggested Citation

  • Neculai Andrei, 2018. "A Double-Parameter Scaling Broyden–Fletcher–Goldfarb–Shanno Method Based on Minimizing the Measure Function of Byrd and Nocedal for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 178(1), pages 191-218, July.
  • Handle: RePEc:spr:joptap:v:178:y:2018:i:1:d:10.1007_s10957-018-1288-3
    DOI: 10.1007/s10957-018-1288-3

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-018-1288-3
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.


    References listed on IDEAS

    1. J. Z. Zhang & N. Y. Deng & L. H. Chen, 1999. "New Quasi-Newton Equation and Related Methods for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 102(1), pages 147-167, July.
    2. Gonglin Yuan & Zengxin Wei, 2010. "Convergence analysis of a modified BFGS method on convex minimizations," Computational Optimization and Applications, Springer, vol. 47(2), pages 237-255, October.
    3. Saman Babaie-Kafaki, 2015. "On Optimality of the Parameters of Self-Scaling Memoryless Quasi-Newton Updating Formulae," Journal of Optimization Theory and Applications, Springer, vol. 167(1), pages 91-101, October.
    4. W. Y. Cheng & D. H. Li, 2010. "Spectral Scaling BFGS Method," Journal of Optimization Theory and Applications, Springer, vol. 146(2), pages 305-319, August.
    5. Neculai Andrei, 2017. "Continuous Nonlinear Optimization for Engineering Applications in GAMS Technology," Springer Optimization and Its Applications, Springer, number 978-3-319-58356-3, June.
    6. Neculai Andrei, 2016. "A New Adaptive Conjugate Gradient Algorithm for Large-Scale Unconstrained Optimization," Springer Optimization and Its Applications, in: Boris Goldengorin (ed.), Optimization and Its Applications in Control and Data Sciences, pages 1-16, Springer.
    7. Neculai Andrei, 2017. "Applications of Continuous Nonlinear Optimization," Springer Optimization and Its Applications, in: Continuous Nonlinear Optimization for Engineering Applications in GAMS Technology, chapter 0, pages 47-117, Springer.
    8. M. Al-Baali, 1998. "Numerical Experience with a Class of Self-Scaling Quasi-Newton Algorithms," Journal of Optimization Theory and Applications, Springer, vol. 96(3), pages 533-553, March.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. S. Cipolla & C. Di Fiore & P. Zellini, 2020. "A variation of Broyden class methods using Householder adaptive transforms," Computational Optimization and Applications, Springer, vol. 77(2), pages 433-463, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. S. Bojari & M. R. Eslahchi, 2020. "Global convergence of a family of modified BFGS methods under a modified weak-Wolfe–Powell line search for nonconvex functions," 4OR, Springer, vol. 18(2), pages 219-244, June.
    2. Yong Li & Gonglin Yuan & Zhou Sheng, 2018. "An active-set algorithm for solving large-scale nonsmooth optimization models with box constraints," PLOS ONE, Public Library of Science, vol. 13(1), pages 1-16, January.
    3. Gonglin Yuan & Xiaoliang Wang & Zhou Sheng, 2020. "The Projection Technique for Two Open Problems of Unconstrained Optimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 186(2), pages 590-619, August.
    4. Martin Ćalasan & Tatjana Konjić & Katarina Kecojević & Lazar Nikitović, 2020. "Optimal Allocation of Static Var Compensators in Electric Power Systems," Energies, MDPI, vol. 13(12), pages 1-24, June.
    5. Neculai Andrei, 2020. "Diagonal Approximation of the Hessian by Finite Differences for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 185(3), pages 859-879, June.
    6. Qiwei Yang & Yantai Huang & Qiangqiang Zhang & Jinjiang Zhang, 2023. "A Bi-Level Optimization and Scheduling Strategy for Charging Stations Considering Battery Degradation," Energies, MDPI, vol. 16(13), pages 1-15, June.
    7. Zexian Liu & Hongwei Liu, 2019. "An Efficient Gradient Method with Approximately Optimal Stepsize Based on Tensor Model for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 181(2), pages 608-633, May.
    8. Yasushi Narushima & Shummin Nakayama & Masashi Takemura & Hiroshi Yabe, 2023. "Memoryless Quasi-Newton Methods Based on the Spectral-Scaling Broyden Family for Riemannian Optimization," Journal of Optimization Theory and Applications, Springer, vol. 197(2), pages 639-664, May.
    9. Kaori Sugiki & Yasushi Narushima & Hiroshi Yabe, 2012. "Globally Convergent Three-Term Conjugate Gradient Methods that Use Secant Conditions and Generate Descent Search Directions for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 153(3), pages 733-757, June.
    10. Fahimeh Biglari & Farideh Mahmoodpur, 2016. "Scaling Damped Limited-Memory Updates for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 170(1), pages 177-188, July.
    11. Saman Babaie-Kafaki & Reza Ghanbari, 2014. "The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices," European Journal of Operational Research, Elsevier, vol. 234(3), pages 625-630.
    12. W. Y. Cheng & D. H. Li, 2010. "Spectral Scaling BFGS Method," Journal of Optimization Theory and Applications, Springer, vol. 146(2), pages 305-319, August.
    13. Yang Yu & Yu Wang & Rui Deng & Yu Yin, 2023. "New DY-HS hybrid conjugate gradient algorithm for solving optimization problem of unsteady partial differential equations with convection term," Mathematics and Computers in Simulation (MATCOM), Elsevier, vol. 208(C), pages 677-701.
    14. Hassan Mohammad & Mohammed Yusuf Waziri, 2019. "Structured Two-Point Stepsize Gradient Methods for Nonlinear Least Squares," Journal of Optimization Theory and Applications, Springer, vol. 181(1), pages 298-317, April.
    15. Fahimeh Biglari & Maghsud Solimanpur, 2013. "Scaling on the Spectral Gradient Method," Journal of Optimization Theory and Applications, Springer, vol. 158(2), pages 626-635, August.
    16. Qi Tian & Xiaoliang Wang & Liping Pang & Mingkun Zhang & Fanyun Meng, 2021. "A New Hybrid Three-Term Conjugate Gradient Algorithm for Large-Scale Unconstrained Problems," Mathematics, MDPI, vol. 9(12), pages 1-13, June.
    17. Zohre Aminifard & Saman Babaie-Kafaki, 2019. "An optimal parameter choice for the Dai–Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix," 4OR, Springer, vol. 17(3), pages 317-330, September.
    18. Gonglin Yuan & Zhou Sheng & Wenjie Liu, 2016. "The Modified HZ Conjugate Gradient Algorithm for Large-Scale Nonsmooth Optimization," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-15, October.
    19. XiaoLiang Dong & Deren Han & Zhifeng Dai & Lixiang Li & Jianguang Zhu, 2018. "An Accelerated Three-Term Conjugate Gradient Method with Sufficient Descent Condition and Conjugacy Condition," Journal of Optimization Theory and Applications, Springer, vol. 179(3), pages 944-961, December.
    20. Shummin Nakayama & Yasushi Narushima & Hiroshi Yabe, 2021. "Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions," Computational Optimization and Applications, Springer, vol. 79(1), pages 127-154, May.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.