
Delayed Weighted Gradient Method with simultaneous step-sizes for strongly convex optimization

Authors

Listed:
  • Hugo Lara (Federal University of Santa Catarina)
  • Rafael Aleixo (Federal University of Santa Catarina)
  • Harry Oviedo (Universidad Adolfo Ibáñez)

Abstract

The Delayed Weighted Gradient Method (DWGM) is a two-step gradient algorithm that is efficient for the minimization of large-scale strictly convex quadratic functions. It enjoys orthogonality properties that make it competitive with the Conjugate Gradient (CG) method. Both methods compute two step-sizes in sequence (CG minimizes the objective function, while DWGM minimizes the gradient norm) along two search directions built from first-order information at the current and previous iterates. The objective of this work is to accelerate the recently developed extension of DWGM to nonquadratic strongly convex minimization problems. Our idea is to define the step-sizes of DWGM through a single two-dimensional convex quadratic optimization problem, computing them simultaneously. Convergence of the resulting algorithm is analyzed, and comparative numerical experiments illustrate the effectiveness of our approach.
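
To make the construction concrete: for a quadratic f(x) = (1/2) x'Ax - b'x with A symmetric positive definite, the delayed update can be written as x_{k+1} = x_{k-1} + beta_k (x_k - alpha_k g_k - x_{k-1}), whose gradient is g_{k+1} = g_{k-1} + beta_k (g_k - g_{k-1}) - alpha_k beta_k A g_k. Substituting s_k = alpha_k beta_k, minimizing ||g_{k+1}||^2 over the pair (beta_k, s_k) becomes a two-variable convex quadratic (linear least-squares) problem, which is one plausible reading of the simultaneous step-size computation described in the abstract. The sketch below illustrates these quadratic-case mechanics; it is our reconstruction from the abstract and from the DWGM reference listed below (Oviedo Leon, 2019), not code from the article, and all names in it are ours.

    import numpy as np

    def dwgm_simultaneous(A, b, x0, tol=1e-8, max_iter=1000):
        # Sketch of a DWGM-style iteration for f(x) = 0.5 x'Ax - b'x, A SPD.
        # Instead of the two sequential step-sizes of classical DWGM, the pair
        # (beta, s), with s = alpha*beta, is found simultaneously by solving
        #   min_{beta, s} || g_prev + beta (g - g_prev) - s A g ||^2,
        # a two-dimensional linear least-squares problem.
        g0 = A @ x0 - b
        w0 = A @ g0
        alpha0 = (g0 @ w0) / (w0 @ w0)     # warm-up: minimal-gradient-norm step
        x_prev, g_prev = x0, g0
        x, g = x0 - alpha0 * g0, g0 - alpha0 * w0
        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            w = A @ g
            # Columns of M span the plane over which the gradient norm
            # is minimized: solve min_t || g_prev + M t ||^2.
            M = np.column_stack((g - g_prev, -w))
            t, *_ = np.linalg.lstsq(M, -g_prev, rcond=None)
            beta, s = t
            # Same point as x_prev + beta*(x - alpha*g - x_prev) with s = alpha*beta.
            x_next = x_prev + beta * (x - x_prev) - s * g
            g_next = g_prev + M @ t        # exact gradient update for quadratics
            x_prev, g_prev = x, g
            x, g = x_next, g_next
        return x

For the nonquadratic strongly convex problems treated in the article, the products A @ g would be replaced by fresh gradient evaluations and the paper's convergence analysis governs the iteration; the sketch above only illustrates the quadratic mechanics.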

Suggested Citation

  • Hugo Lara & Rafael Aleixo & Harry Oviedo, 2024. "Delayed Weighted Gradient Method with simultaneous step-sizes for strongly convex optimization," Computational Optimization and Applications, Springer, vol. 89(1), pages 151-182, September.
  • Handle: RePEc:spr:coopap:v:89:y:2024:i:1:d:10.1007_s10589-024-00586-4
    DOI: 10.1007/s10589-024-00586-4

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10589-024-00586-4
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10589-024-00586-4?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Yakui Huang & Yu-Hong Dai & Xin-Wei Liu & Hongchao Zhang, 2022. "On the acceleration of the Barzilai–Borwein method," Computational Optimization and Applications, Springer, vol. 81(3), pages 717-740, April.
    2. Roberta De Asmundis & Daniela di Serafino & William Hager & Gerardo Toraldo & Hongchao Zhang, 2014. "An efficient gradient method using the Yuan steplength," Computational Optimization and Applications, Springer, vol. 59(3), pages 541-563, December.
    3. Harry Fernando Oviedo Leon, 2019. "A delayed weighted gradient method for strictly convex quadratic minimization," Computational Optimization and Applications, Springer, vol. 74(3), pages 729-746, December.
    4. Giulia Ferrandi & Michiel E. Hochstenbach & Nataša Krejić, 2023. "A harmonic framework for stepsize selection in gradient methods," Computational Optimization and Applications, Springer, vol. 85(1), pages 75-106, May.
    5. Roberto Andreani & Marcos Raydan, 2021. "Properties of the delayed weighted gradient method," Computational Optimization and Applications, Springer, vol. 78(1), pages 167-180, January.
    6. di Serafino, Daniela & Ruggiero, Valeria & Toraldo, Gerardo & Zanni, Luca, 2018. "On the steplength selection in gradient methods for unconstrained optimization," Applied Mathematics and Computation, Elsevier, vol. 318(C), pages 176-195.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Roberto Andreani & Marcos Raydan, 2021. "Properties of the delayed weighted gradient method," Computational Optimization and Applications, Springer, vol. 78(1), pages 167-180, January.
    2. Serena Crisci & Federica Porta & Valeria Ruggiero & Luca Zanni, 2023. "Hybrid limited memory gradient projection methods for box-constrained optimization problems," Computational Optimization and Applications, Springer, vol. 84(1), pages 151-189, January.
    3. Corsaro, Stefania & De Simone, Valentina & Marino, Zelda, 2021. "Split Bregman iteration for multi-period mean variance portfolio optimization," Applied Mathematics and Computation, Elsevier, vol. 392(C).
    4. E. Loli Piccolomini & V. L. Coli & E. Morotti & L. Zanni, 2018. "Reconstruction of 3D X-ray CT images from reduced sampling by a scaled gradient projection algorithm," Computational Optimization and Applications, Springer, vol. 71(1), pages 171-191, September.
    5. Harry Fernando Oviedo Leon, 2019. "A delayed weighted gradient method for strictly convex quadratic minimization," Computational Optimization and Applications, Springer, vol. 74(3), pages 729-746, December.
    6. Behzad Azmi & Karl Kunisch, 2020. "Analysis of the Barzilai-Borwein Step-Sizes for Problems in Hilbert Spaces," Journal of Optimization Theory and Applications, Springer, vol. 185(3), pages 819-844, June.
    7. Giulia Ferrandi & Michiel E. Hochstenbach & Nataša Krejić, 2023. "A harmonic framework for stepsize selection in gradient methods," Computational Optimization and Applications, Springer, vol. 85(1), pages 75-106, May.
    8. Yakui Huang & Yu-Hong Dai & Xin-Wei Liu & Hongchao Zhang, 2022. "On the acceleration of the Barzilai–Borwein method," Computational Optimization and Applications, Springer, vol. 81(3), pages 717-740, April.
    9. Crisci, Serena & Ruggiero, Valeria & Zanni, Luca, 2019. "Steplength selection in gradient projection methods for box-constrained quadratic programs," Applied Mathematics and Computation, Elsevier, vol. 356(C), pages 312-327.
    10. Yu-Hong Dai & Yakui Huang & Xin-Wei Liu, 2019. "A family of spectral gradient methods for optimization," Computational Optimization and Applications, Springer, vol. 74(1), pages 43-65, September.
    11. Stefania Corsaro & Valentina Simone, 2019. "Adaptive l1-regularization for short-selling control in portfolio selection," Computational Optimization and Applications, Springer, vol. 72(2), pages 457-478, March.
    12. di Serafino, Daniela & Ruggiero, Valeria & Toraldo, Gerardo & Zanni, Luca, 2018. "On the steplength selection in gradient methods for unconstrained optimization," Applied Mathematics and Computation, Elsevier, vol. 318(C), pages 176-195.
    13. Na Huang, 2022. "On R-linear convergence analysis for a class of gradient methods," Computational Optimization and Applications, Springer, vol. 81(1), pages 161-177, January.
    14. Stefania Corsaro & Valentina De Simone & Zelda Marino, 2021. "Fused Lasso approach in portfolio selection," Annals of Operations Research, Springer, vol. 299(1), pages 47-59, April.
    15. O. P. Ferreira & M. Lemes & L. F. Prudente, 2022. "On the inexact scaled gradient projection method," Computational Optimization and Applications, Springer, vol. 81(1), pages 91-125, January.
    16. Bonettini, Silvia & Prato, Marco & Rebegoldi, Simone, 2016. "A cyclic block coordinate descent method with generalized gradient projections," Applied Mathematics and Computation, Elsevier, vol. 286(C), pages 288-300.
    17. Marco Viola & Mara Sangiovanni & Gerardo Toraldo & Mario R. Guarracino, 2019. "Semi-supervised generalized eigenvalues classification," Annals of Operations Research, Springer, vol. 276(1), pages 249-266, May.
    18. Masoud Fatemi, 2022. "On initial point selection of the steepest descent algorithm for general quadratic functions," Computational Optimization and Applications, Springer, vol. 82(2), pages 329-360, June.
    19. Clóvis Gonzaga & Ruana Schneider, 2016. "On the steepest descent algorithm for quadratic functions," Computational Optimization and Applications, Springer, vol. 63(2), pages 523-542, March.
    20. di Serafino, Daniela & Toraldo, Gerardo & Viola, Marco, 2021. "Using gradient directions to get global convergence of Newton-type methods," Applied Mathematics and Computation, Elsevier, vol. 409(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:coopap:v:89:y:2024:i:1:d:10.1007_s10589-024-00586-4. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. Registration allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.