
An Approach for Analyzing the Global Rate of Convergence of Quasi-Newton and Truncated-Newton Methods

Authors

  • T. L. Jensen (Aalborg University)
  • M. Diehl (University of Freiburg)

Abstract

Quasi-Newton and truncated-Newton methods are popular methods in optimization, traditionally seen as useful alternatives to the gradient and Newton methods. Throughout the literature, results link quasi-Newton methods to certain first-order methods under various assumptions. We offer a simple proof that a range of quasi-Newton methods are first-order methods in the sense of Nesterov's definition. We further define a class of generalized first-order methods, show that the truncated-Newton method is a generalized first-order method, and show that first-order methods and generalized first-order methods share the same worst-case convergence rates. We also extend the complexity analysis for smooth, strongly convex problems to finite dimensions. An implication of these results is that, in the worst case, the local superlinear or faster convergence of quasi-Newton and truncated-Newton methods cannot take effect until the number of iterations exceeds half the problem dimension.
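The structural fact behind this classification can be checked numerically. Below is a minimal sketch (our illustration, not code or notation from the paper): it runs standard BFGS with an identity initial inverse-Hessian approximation and exact line search on a random convex quadratic, and verifies Nesterov's first-order-method condition that each iterate satisfies x_k ∈ x_0 + span{∇f(x_0), …, ∇f(x_{k-1})}.

    # Minimal sketch (our illustration, not the authors' proof): BFGS with
    # H_0 = I and exact line search on a convex quadratic. We check that
    # each iterate stays in x_0 + span of the gradients observed so far,
    # which is Nesterov's definition of a first-order method.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 8
    A = rng.standard_normal((n, n))
    Q = A.T @ A + n * np.eye(n)              # positive-definite Hessian
    b = rng.standard_normal(n)
    grad = lambda x: Q @ x - b               # gradient of f(x) = 0.5 x'Qx - b'x

    x0 = rng.standard_normal(n)
    x, H = x0.copy(), np.eye(n)              # H approximates the inverse Hessian
    grads = [grad(x0)]

    for k in range(6):
        g = grads[-1]
        d = -H @ g                           # quasi-Newton search direction
        t = -(d @ g) / (d @ Q @ d)           # exact line search on the quadratic
        s = t * d
        x = x + s
        g_new = grad(x)
        y = g_new - g
        rho = 1.0 / (y @ s)                  # y's > 0 on a convex quadratic
        V = np.eye(n) - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)   # standard BFGS update of H
        grads.append(g_new)

        # Distance of x_{k+1} - x_0 from the span of the observed gradients:
        G = np.column_stack(grads[:-1])      # g_0, ..., g_k
        c, *_ = np.linalg.lstsq(G, x - x0, rcond=None)
        print(f"iter {k+1}: residual outside gradient span = "
              f"{np.linalg.norm(G @ c - (x - x0)):.1e}")   # ~ machine precision

Since every iterate is confined to the span of the gradients observed so far, such a method cannot outrun first-order worst-case lower bounds before that span can cover the whole space, which is the intuition behind the half-the-dimension threshold stated in the abstract.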

Suggested Citation

  • T. L. Jensen & M. Diehl, 2017. "An Approach for Analyzing the Global Rate of Convergence of Quasi-Newton and Truncated-Newton Methods," Journal of Optimization Theory and Applications, Springer, vol. 172(1), pages 206-221, January.
  • Handle: RePEc:spr:joptap:v:172:y:2017:i:1:d:10.1007_s10957-016-1013-z
    DOI: 10.1007/s10957-016-1013-z

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-016-1013-z
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-016-1013-z?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Nesterov, Yu., 2007. "Gradient methods for minimizing composite objective function," LIDAM Discussion Papers CORE 2007076, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    2. David F. Shanno, 1978. "Conjugate Gradient Methods with Inexact Searches," Mathematics of Operations Research, INFORMS, vol. 3(3), pages 244-256, August.
    3. D.G. Hull, 2002. "On the Huang Class of Variable Metric Methods," Journal of Optimization Theory and Applications, Springer, vol. 113(1), pages 1-4, April.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yasushi Narushima & Shummin Nakayama & Masashi Takemura & Hiroshi Yabe, 2023. "Memoryless Quasi-Newton Methods Based on the Spectral-Scaling Broyden Family for Riemannian Optimization," Journal of Optimization Theory and Applications, Springer, vol. 197(2), pages 639-664, May.
    2. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
    3. Mingqiang Li & Congying Han & Ruxin Wang & Tiande Guo, 2017. "Shrinking gradient descent algorithms for total variation regularized image denoising," Computational Optimization and Applications, Springer, vol. 68(3), pages 643-660, December.
    4. Hansen, Christian & Liao, Yuan, 2019. "The Factor-Lasso And K-Step Bootstrap Approach For Inference In High-Dimensional Economic Applications," Econometric Theory, Cambridge University Press, vol. 35(3), pages 465-509, June.
    5. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2022. "Penalized wavelet estimation and robust denoising for irregular spaced data," Computational Statistics, Springer, vol. 37(4), pages 1621-1651, September.
    6. Silvia Villa & Lorenzo Rosasco & Sofia Mosci & Alessandro Verri, 2014. "Proximal methods for the latent group lasso penalty," Computational Optimization and Applications, Springer, vol. 58(2), pages 381-407, June.
    7. Fischer, Manfred M. & Staufer, Petra, 1998. "Optimization in an Error Backpropagation Neural Network Environment with a Performance Test on a Pattern Classification Problem," MPRA Paper 77810, University Library of Munich, Germany.
    8. Kenneth Lange & Eric C. Chi & Hua Zhou, 2014. "A Brief Survey of Modern Optimization for Statisticians," International Statistical Review, International Statistical Institute, vol. 82(1), pages 46-70, April.
    9. D. Russell Luke & Nguyen H. Thao & Matthew K. Tam, 2018. "Quantitative Convergence Analysis of Iterated Expansive, Set-Valued Mappings," Mathematics of Operations Research, INFORMS, vol. 43(4), pages 1143-1176, November.
    10. Qihang Lin & Lin Xiao, 2015. "An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization," Computational Optimization and Applications, Springer, vol. 60(3), pages 633-674, April.
    11. Zachary F. Fisher & Younghoon Kim & Barbara L. Fredrickson & Vladas Pipiras, 2022. "Penalized Estimation and Forecasting of Multiple Subject Intensive Longitudinal Data," Psychometrika, Springer;The Psychometric Society, vol. 87(2), pages 1-29, June.
    12. Yao Dong & He Jiang, 2018. "A Two-Stage Regularization Method for Variable Selection and Forecasting in High-Order Interaction Model," Complexity, Hindawi, vol. 2018, pages 1-12, November.
    13. Radu-Alexandru Dragomir & Alexandre d’Aspremont & Jérôme Bolte, 2021. "Quartic First-Order Methods for Low-Rank Minimization," Journal of Optimization Theory and Applications, Springer, vol. 189(2), pages 341-363, May.
    14. Qihang Lin & Xi Chen & Javier Peña, 2014. "A sparsity preserving stochastic gradient methods for sparse regression," Computational Optimization and Applications, Springer, vol. 58(2), pages 455-482, June.
    15. Xiubo Liang & Guoqiang Wang & Bo Yu, 2022. "A reduced proximal-point homotopy method for large-scale non-convex BQP," Computational Optimization and Applications, Springer, vol. 81(2), pages 539-567, March.
    16. Ke-Lin Du & Chi-Sing Leung & Wai Ho Mow & M. N. S. Swamy, 2022. "Perceptron: Learning, Generalization, Model Selection, Fault Tolerance, and Role in the Deep Learning Era," Mathematics, MDPI, vol. 10(24), pages 1-46, December.
    17. Olivier Devolder & François Glineur & Yurii Nesterov, 2011. "First-order methods of smooth convex optimization with inexact oracle," LIDAM Discussion Papers CORE 2011002, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    18. N. Mahdavi-Amiri & M. Shaeiri, 2020. "A conjugate gradient sampling method for nonsmooth optimization," 4OR, Springer, vol. 18(1), pages 73-90, March.
    19. Emilie Chouzenoux & Jean-Christophe Pesquet & Audrey Repetti, 2014. "Variable Metric Forward–Backward Algorithm for Minimizing the Sum of a Differentiable Function and a Convex Function," Journal of Optimization Theory and Applications, Springer, vol. 162(1), pages 107-132, July.
    20. Andrei, Neculai, 2010. "Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization," European Journal of Operational Research, Elsevier, vol. 204(3), pages 410-420, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joptap:v:172:y:2017:i:1:d:10.1007_s10957-016-1013-z. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.