
An adaptive regularized proximal Newton-type methods for composite optimization over the Stiefel manifold

Author

Listed:
  • Qinsi Wang

    (Fudan University)

  • Wei Hong Yang

    (Fudan University)

Abstract

Recently, the proximal Newton-type method and its variants have been generalized to solve composite optimization problems over the Stiefel manifold whose objective function is the sum of a smooth function and a nonsmooth function. In this paper, we propose an adaptive quadratically regularized proximal quasi-Newton method, named ARPQN, to solve this class of problems. Under some mild assumptions, the global convergence, the local linear convergence rate and the iteration complexity of ARPQN are established. Numerical experiments and comparisons with other state-of-the-art methods indicate that ARPQN is very promising. We also propose an adaptive quadratically regularized proximal Newton method, named ARPN. It is shown that the ARPN method has a local superlinear convergence rate under certain reasonable assumptions, which demonstrates the attractive convergence properties of regularized proximal Newton methods.
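To make the problem class concrete, the following Python sketch sets up a sparse-PCA-style composite objective over the Stiefel manifold St(n, p) = {X : XᵀX = I}, with smooth part f(X) = -0.5·tr(XᵀAX) and nonsmooth part g(X) = mu·‖X‖₁, and runs a naive Euclidean proximal gradient step followed by a QR retraction. This is only an illustration of the problem setting under assumed choices of objective, retraction, and parameters; it is not the ARPQN or ARPN algorithm studied in the paper.

```python
# Minimal sketch (NOT the paper's ARPQN/ARPN algorithms): a sparse-PCA-style
# composite objective F(X) = f(X) + g(X) over the Stiefel manifold
# St(n, p) = {X : X^T X = I_p}, minimized with a naive Euclidean proximal
# gradient step followed by a QR retraction.  The objective, the retraction
# and all parameter values below are illustrative assumptions.
import numpy as np

def retract_qr(Y):
    """QR-based retraction onto the Stiefel manifold (sign-fixed Q factor)."""
    Q, R = np.linalg.qr(Y)
    return Q * np.sign(np.sign(np.diag(R)) + 0.5)

def soft_threshold(Y, tau):
    """Proximal operator of tau * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(Y) * np.maximum(np.abs(Y) - tau, 0.0)

def sparse_pca_proxgrad(A, p, mu=0.1, step=1e-2, iters=500, seed=0):
    """Naive illustration: prox-gradient step on f + g, then retract to St(n, p)."""
    rng = np.random.default_rng(seed)
    X = retract_qr(rng.standard_normal((A.shape[0], p)))   # random feasible start
    for _ in range(iters):
        grad_f = -A @ X                                     # gradient of f(X) = -0.5*tr(X^T A X), A symmetric
        Y = soft_threshold(X - step * grad_f, step * mu)    # prox step on the l1 term
        X = retract_qr(Y)                                   # pull the iterate back to the manifold
    return X

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    B = rng.standard_normal((50, 20))
    A = B @ B.T / 20.0                                      # symmetric covariance-like matrix
    X = sparse_pca_proxgrad(A, p=4)
    print("feasibility ||X^T X - I||:", np.linalg.norm(X.T @ X - np.eye(4)))
    print("objective F(X):", -0.5 * np.trace(X.T @ A @ X) + 0.1 * np.abs(X).sum())
```

Proximal Newton-type methods of the kind studied in the paper would instead replace the plain gradient step with a regularized (quasi-)Newton model of the smooth part and solve the resulting proximal subproblem on the tangent space, which is where the stronger local convergence guarantees come from.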

Suggested Citation

  • Qinsi Wang & Wei Hong Yang, 2024. "An adaptive regularized proximal Newton-type methods for composite optimization over the Stiefel manifold," Computational Optimization and Applications, Springer, vol. 89(2), pages 419-457, November.
  • Handle: RePEc:spr:coopap:v:89:y:2024:i:2:d:10.1007_s10589-024-00595-3
    DOI: 10.1007/s10589-024-00595-3

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10589-024-00595-3
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10589-024-00595-3?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. O. P. Ferreira & P. R. Oliveira, 1998. "Subgradient Algorithm on Riemannian Manifolds," Journal of Optimization Theory and Applications, Springer, vol. 97(1), pages 93-104, April.
    2. Hiva Ghanbari & Katya Scheinberg, 2018. "Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates," Computational Optimization and Applications, Springer, vol. 69(3), pages 597-627, April.
    3. Wen Huang & Ke Wei, 2023. "An inexact Riemannian proximal gradient method," Computational Optimization and Applications, Springer, vol. 85(1), pages 1-32, May.
    4. Geovani N. Grapiglia & Yurii Nesterov, 2017. "Regularized Newton methods for minimizing functions with Hölder continuous Hessians," LIDAM Reprints CORE 2846, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    5. Geovani N. Grapiglia & Yurii Nesterov, 2019. "Accelerated regularized Newton methods for minimizing composite convex functions," LIDAM Reprints CORE 3058, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Nikita Doikov & Yurii Nesterov, 2021. "Minimizing Uniformly Convex Functions by Cubic Regularization of Newton Method," Journal of Optimization Theory and Applications, Springer, vol. 189(1), pages 317-339, April.
    2. João Carlos de O. Souza, 2018. "Proximal Point Methods for Lipschitz Functions on Hadamard Manifolds: Scalar and Vectorial Cases," Journal of Optimization Theory and Applications, Springer, vol. 179(3), pages 745-760, December.
    3. J. X. Cruz Neto & F. M. O. Jacinto & P. A. Soares & J. C. O. Souza, 2018. "On maximal monotonicity of bifunctions on Hadamard manifolds," Journal of Global Optimization, Springer, vol. 72(3), pages 591-601, November.
    4. Ching-pei Lee & Stephen J. Wright, 2019. "Inexact Successive quadratic approximation for regularized optimization," Computational Optimization and Applications, Springer, vol. 72(3), pages 641-674, April.
    5. Christian Kanzow & Theresa Lechner, 2021. "Globalized inexact proximal Newton-type methods for nonconvex composite functions," Computational Optimization and Applications, Springer, vol. 78(2), pages 377-410, March.
    6. X. M. Wang & C. Li & J. C. Yao, 2015. "Subgradient Projection Algorithms for Convex Feasibility on Riemannian Manifolds with Lower Bounded Curvatures," Journal of Optimization Theory and Applications, Springer, vol. 164(1), pages 202-217, January.
    7. V. S. Amaral & R. Andreani & E. G. Birgin & D. S. Marcondes & J. M. Martínez, 2022. "On complexity and convergence of high-order coordinate descent algorithms for smooth nonconvex box-constrained minimization," Journal of Global Optimization, Springer, vol. 84(3), pages 527-561, November.
    8. Glaydston C. Bento & Jefferson G. Melo, 2012. "Subgradient Method for Convex Feasibility on Riemannian Manifolds," Journal of Optimization Theory and Applications, Springer, vol. 152(3), pages 773-785, March.
    9. Guo-ji Tang & Nan-jing Huang, 2012. "Korpelevich’s method for variational inequality problems on Hadamard manifolds," Journal of Global Optimization, Springer, vol. 54(3), pages 493-509, November.
    10. Nicholas I. M. Gould & Tyrone Rees & Jennifer A. Scott, 2019. "Convergence and evaluation-complexity analysis of a regularized tensor-Newton method for solving nonlinear least-squares problems," Computational Optimization and Applications, Springer, vol. 73(1), pages 1-35, May.
    11. G. C. Bento & J. X. Cruz Neto, 2013. "A Subgradient Method for Multiobjective Optimization on Riemannian Manifolds," Journal of Optimization Theory and Applications, Springer, vol. 159(1), pages 125-137, October.
    12. Yurii Nesterov, 2024. "Set-Limited Functions and Polynomial-Time Interior-Point Methods," Journal of Optimization Theory and Applications, Springer, vol. 202(1), pages 11-26, July.
    13. Rujun Jiang & Man-Chung Yue & Zhishuo Zhou, 2021. "An accelerated first-order method with complexity analysis for solving cubic regularization subproblems," Computational Optimization and Applications, Springer, vol. 79(2), pages 471-506, June.
    14. Xiao-bo Li & Li-wen Zhou & Nan-jing Huang, 2016. "Gap Functions and Global Error Bounds for Generalized Mixed Variational Inequalities on Hadamard Manifolds," Journal of Optimization Theory and Applications, Springer, vol. 168(3), pages 830-849, March.
    15. G. C. Bento & O. P. Ferreira & P. R. Oliveira, 2012. "Unconstrained Steepest Descent Method for Multicriteria Optimization on Riemannian Manifolds," Journal of Optimization Theory and Applications, Springer, vol. 154(1), pages 88-107, July.
    16. da Silva Alves, Charlan Dellon & Oliveira, Paulo Roberto & Gregório, Ronaldo Malheiros, 2021. "Lα Riemannian weighted centers of mass applied to compose an image filter to diffusion tensor imaging," Applied Mathematics and Computation, Elsevier, vol. 390(C).
    17. Glaydston C. Bento & Orizon P. Ferreira & Jefferson G. Melo, 2017. "Iteration-Complexity of Gradient, Subgradient and Proximal Point Methods on Riemannian Manifolds," Journal of Optimization Theory and Applications, Springer, vol. 173(2), pages 548-562, May.
    18. Peng Zhang & Gejun Bao, 2018. "An Incremental Subgradient Method on Riemannian Manifolds," Journal of Optimization Theory and Applications, Springer, vol. 176(3), pages 711-727, March.
    19. Lei Wang & Xin Liu & Yin Zhang, 2023. "A communication-efficient and privacy-aware distributed algorithm for sparse PCA," Computational Optimization and Applications, Springer, vol. 85(3), pages 1033-1072, July.
    20. E. G. Birgin & J. M. Martínez, 2019. "A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization," Computational Optimization and Applications, Springer, vol. 73(3), pages 707-753, July.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:coopap:v:89:y:2024:i:2:d:10.1007_s10589-024-00595-3. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.