IDEAS home Printed from https://ideas.repec.org/a/spr/jglopt/v84y2022i2d10.1007_s10898-022-01151-1.html

An adaptive high order method for finding third-order critical points of nonconvex optimization

Author

Listed:
  • Xihua Zhu

    (Shanghai University of Finance and Economics)

  • Jiangze Han

    (University of British Columbia)

  • Bo Jiang

    (Shanghai University of Finance and Economics)

Abstract

Recently, optimization methods for computing higher-order critical points of nonconvex problems have attracted growing research interest (Anandkumar, Conference on Learning Theory 81-102, 2016; Cartis, Found Comput Math 18:1073-1107, 2018; Cartis, SIAM J Optim 30:513-541, 2020; Chen, Math Program 187:47-78, 2021), as such points exclude the so-called degenerate saddle points and therefore correspond to solutions of better quality. Despite the theoretical developments in these works, the corresponding numerical experiments are missing. This paper proposes an implementable higher-order method, named the adaptive high order method (AHOM), for finding third-order critical points. AHOM works by solving an “easier” subproblem and incorporating an adaptive parameter-tuning strategy in each iteration of the algorithm. The iteration complexity of the proposed method is established. Preliminary numerical results show that AHOM can escape degenerate saddle points at which second-order methods may get stuck.
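The degenerate saddle points the abstract refers to can be made concrete with a small example. The sketch below is a standard textbook toy case, not the paper's AHOM algorithm: a point where the gradient vanishes and the Hessian is positive semidefinite, so first- and second-order necessary conditions both hold, yet the point is not a local minimum and only third-order (or higher) information can reject it.

```python
# Toy illustration of a degenerate saddle point (NOT the paper's AHOM method):
#
#   f(x, y) = x**2 - y**4
#
# At the origin the gradient vanishes and the Hessian diag(2, 0) is
# positive semidefinite, so (0, 0) passes first- and second-order
# necessary conditions.  Yet f strictly decreases along the y-axis,
# which only higher-order derivative information can detect.

def f(x, y):
    return x**2 - y**4

def grad(x, y):
    return (2.0 * x, -4.0 * y**3)

def hess_diag(x, y):
    # The Hessian of this f is diagonal; return its diagonal entries.
    return (2.0, -12.0 * y**2)

# (0, 0) is first- and second-order stationary ...
assert grad(0.0, 0.0) == (0.0, 0.0)
assert all(d >= 0.0 for d in hess_diag(0.0, 0.0))

# ... but not a local minimum: any step along the y-axis decreases f.
assert f(0.0, 0.1) < f(0.0, 0.0)
```

A second-order model of f around the origin sees zero curvature along y and cannot distinguish this point from a minimizer; a method that checks third-order criticality, as AHOM is designed to do, can certify a descent direction and escape.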

Suggested Citation

  • Xihua Zhu & Jiangze Han & Bo Jiang, 2022. "An adaptive high order method for finding third-order critical points of nonconvex optimization," Journal of Global Optimization, Springer, vol. 84(2), pages 369-392, October.
  • Handle: RePEc:spr:jglopt:v:84:y:2022:i:2:d:10.1007_s10898-022-01151-1
    DOI: 10.1007/s10898-022-01151-1

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10898-022-01151-1
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10898-022-01151-1?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Kenji Ueda & Nobuo Yamashita, 2010. "On a Global Complexity Bound of the Levenberg-Marquardt Method," Journal of Optimization Theory and Applications, Springer, vol. 147(3), pages 443-453, December.
    2. Anton Rodomanov & Yurii Nesterov, 2020. "Smoothness Parameter of Power of Euclidean Norm," Journal of Optimization Theory and Applications, Springer, vol. 185(2), pages 303-326, May.
    3. Silvia Berra & Alessandro Torraca & Federico Benvenuto & Sara Sommariva, 2024. "Combined Newton-Gradient Method for Constrained Root-Finding in Chemical Reaction Networks," Journal of Optimization Theory and Applications, Springer, vol. 200(1), pages 404-427, January.
    4. Ariizumi, Shumpei & Yamakawa, Yuya & Yamashita, Nobuo, 2024. "Convergence properties of Levenberg–Marquardt methods with generalized regularization terms," Applied Mathematics and Computation, Elsevier, vol. 463(C).
    5. Maryam Yashtini, 2022. "Convergence and rate analysis of a proximal linearized ADMM for nonconvex nonsmooth optimization," Journal of Global Optimization, Springer, vol. 84(4), pages 913-939, December.
    6. Seonho Park & Seung Hyun Jung & Panos M. Pardalos, 2020. "Combining Stochastic Adaptive Cubic Regularization with Negative Curvature for Nonconvex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 184(3), pages 953-971, March.
    7. Nadav Hallak & Marc Teboulle, 2020. "Finding Second-Order Stationary Points in Constrained Minimization: A Feasible Direction Approach," Journal of Optimization Theory and Applications, Springer, vol. 186(2), pages 480-503, August.
    8. Weiwei Kong & Jefferson G. Melo & Renato D. C. Monteiro, 2020. "An efficient adaptive accelerated inexact proximal point method for solving linearly constrained nonconvex composite problems," Computational Optimization and Applications, Springer, vol. 76(2), pages 305-346, June.
    9. Alkousa, Mohammad & Stonyakin, Fedor & Gasnikov, Alexander & Abdo, Asmaa & Alcheikh, Mohammad, 2024. "Higher degree inexact model for optimization problems," Chaos, Solitons & Fractals, Elsevier, vol. 186(C).
    10. Chuan He & Heng Huang & Zhaosong Lu, 2024. "A Newton-CG based barrier-augmented Lagrangian method for general nonconvex conic optimization," Computational Optimization and Applications, Springer, vol. 89(3), pages 843-894, December.
    11. Geovani Nunes Grapiglia & Jinyun Yuan & Ya-xiang Yuan, 2016. "Nonlinear Stepsize Control Algorithms: Complexity Bounds for First- and Second-Order Optimality," Journal of Optimization Theory and Applications, Springer, vol. 171(3), pages 980-997, December.
    12. Zehui Jia & Xue Gao & Xingju Cai & Deren Han, 2021. "Local Linear Convergence of the Alternating Direction Method of Multipliers for Nonconvex Separable Optimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 188(1), pages 1-25, January.
    13. Francisco Facchinei & Vyacheslav Kungurtsev & Lorenzo Lampariello & Gesualdo Scutari, 2021. "Ghost Penalties in Nonconvex Constrained Optimization: Diminishing Stepsizes and Iteration Complexity," Mathematics of Operations Research, INFORMS, vol. 46(2), pages 595-627, May.
    14. Yurii Nesterov, 2024. "Set-Limited Functions and Polynomial-Time Interior-Point Methods," Journal of Optimization Theory and Applications, Springer, vol. 202(1), pages 11-26, July.
    15. Rujun Jiang & Man-Chung Yue & Zhishuo Zhou, 2021. "An accelerated first-order method with complexity analysis for solving cubic regularization subproblems," Computational Optimization and Applications, Springer, vol. 79(2), pages 471-506, June.
    16. Weiwei Kong & Renato D. C. Monteiro, 2023. "An accelerated inexact dampened augmented Lagrangian method for linearly-constrained nonconvex composite optimization problems," Computational Optimization and Applications, Springer, vol. 85(2), pages 509-545, June.
    17. Kenji Ueda & Nobuo Yamashita, 2012. "Global Complexity Bound Analysis of the Levenberg–Marquardt Method for Nonsmooth Equations and Its Application to the Nonlinear Complementarity Problem," Journal of Optimization Theory and Applications, Springer, vol. 152(2), pages 450-467, February.
    18. Elizabeth Karas & Sandra Santos & Benar Svaiter, 2015. "Algebraic rules for quadratic regularization of Newton’s method," Computational Optimization and Applications, Springer, vol. 60(2), pages 343-376, March.
    19. Geovani N. Grapiglia & Ekkehard W. Sachs, 2017. "On the worst-case evaluation complexity of non-monotone line search algorithms," Computational Optimization and Applications, Springer, vol. 68(3), pages 555-577, December.
    20. Andreas Themelis & Lorenzo Stella & Panagiotis Patrinos, 2022. "Douglas–Rachford splitting and ADMM for nonconvex optimization: accelerated and Newton-type linesearch algorithms," Computational Optimization and Applications, Springer, vol. 82(2), pages 395-440, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:jglopt:v:84:y:2022:i:2:d:10.1007_s10898-022-01151-1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows your profile to be linked to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.