Printed from https://ideas.repec.org/a/spr/coopap/v82y2022i2d10.1007_s10589-022-00366-y.html

Douglas–Rachford splitting and ADMM for nonconvex optimization: accelerated and Newton-type linesearch algorithms

Author

Listed:
  • Andreas Themelis

    (Kyushu University)

  • Lorenzo Stella

    (Amazon)

  • Panagiotis Patrinos

    (KU Leuven)

Abstract

Although the performance of popular optimization algorithms such as the Douglas–Rachford splitting (DRS) and the ADMM is satisfactory in convex and well-scaled problems, ill conditioning and nonconvexity pose a severe obstacle to their reliable employment. Expanding on recent convergence results for DRS and ADMM applied to nonconvex problems, we propose two linesearch algorithms to enhance and robustify these methods by means of quasi-Newton directions. The proposed algorithms are suited for nonconvex problems, require the same black-box oracle of DRS and ADMM, and maintain their (subsequential) convergence properties. Numerical evidence shows that the employment of L-BFGS in the proposed framework greatly improves convergence of DRS and ADMM, making them robust to ill conditioning. Under regularity and nondegeneracy assumptions at the limit point, superlinear convergence is shown when quasi-Newton Broyden directions are adopted.
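The abstract builds on the classical Douglas–Rachford splitting iteration for minimizing a sum f(x) + g(x) of two functions via their proximal mappings. A minimal sketch of plain DRS (not the nonconvex setting or the quasi-Newton linesearch of the paper) on an illustrative toy problem, with f a simple quadratic and g the ℓ1 norm, both choices assumed here for the example:

```python
# Plain Douglas-Rachford splitting (DRS) sketch for min_x f(x) + g(x).
# Toy instance: f(x) = 0.5*||x - a||^2, g(x) = lam*||x||_1, so both
# proximal mappings have closed forms. This illustrates the black-box
# prox oracle DRS requires; it is not the paper's accelerated method.
import numpy as np

def prox_f(v, gamma, a):
    # prox of gamma*f for f(x) = 0.5*||x - a||^2 (closed form)
    return (v + gamma * a) / (1.0 + gamma)

def prox_g(v, gamma, lam):
    # soft-thresholding: prox of gamma*lam*||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - gamma * lam, 0.0)

def drs(a, lam, gamma=0.5, iters=200):
    s = np.zeros_like(a)
    for _ in range(iters):
        x = prox_f(s, gamma, a)            # first prox step
        y = prox_g(2 * x - s, gamma, lam)  # prox at the reflected point
        s = s + y - x                      # update the governing sequence
    return y

a = np.array([3.0, -0.5, 1.0])
x_star = drs(a, lam=1.0)
# minimizer of 0.5*||x - a||^2 + ||x||_1 is soft-threshold(a, 1) = [2, 0, 0]
```

The paper's linesearch algorithms keep exactly this prox oracle but replace the plain fixed-point update of `s` with quasi-Newton (e.g. L-BFGS or Broyden) directions on an associated merit function, safeguarded so that the convergence guarantees of DRS/ADMM are retained.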

Suggested Citation

  • Andreas Themelis & Lorenzo Stella & Panagiotis Patrinos, 2022. "Douglas–Rachford splitting and ADMM for nonconvex optimization: accelerated and Newton-type linesearch algorithms," Computational Optimization and Applications, Springer, vol. 82(2), pages 395-440, June.
  • Handle: RePEc:spr:coopap:v:82:y:2022:i:2:d:10.1007_s10589-022-00366-y
    DOI: 10.1007/s10589-022-00366-y

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10589-022-00366-y
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10589-022-00366-y?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lorenzo Stella & Andreas Themelis & Panagiotis Patrinos, 2017. "Forward–backward quasi-Newton methods for nonsmooth optimization problems," Computational Optimization and Applications, Springer, vol. 67(3), pages 443-487, July.
    2. Bo Jiang & Tianyi Lin & Shiqian Ma & Shuzhong Zhang, 2019. "Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis," Computational Optimization and Applications, Springer, vol. 72(1), pages 115-157, January.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Maryam Yashtini, 2022. "Convergence and rate analysis of a proximal linearized ADMM for nonconvex nonsmooth optimization," Journal of Global Optimization, Springer, vol. 84(4), pages 913-939, December.
    2. Silvia Bonettini & Peter Ochs & Marco Prato & Simone Rebegoldi, 2023. "An abstract convergence framework with application to inertial inexact forward–backward methods," Computational Optimization and Applications, Springer, vol. 84(2), pages 319-362, March.
    3. Pontus Giselsson & Mattias Fält, 2018. "Envelope Functions: Unifications and Further Properties," Journal of Optimization Theory and Applications, Springer, vol. 178(3), pages 673-698, September.
    4. Peter Ochs, 2018. "Local Convergence of the Heavy-Ball Method and iPiano for Non-convex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 177(1), pages 153-180, April.
    5. Ryosuke Shimmura & Joe Suzuki, 2024. "Newton-Type Methods with the Proximal Gradient Step for Sparse Estimation," SN Operations Research Forum, Springer, vol. 5(2), pages 1-27, June.
    6. Christian Kanzow & Theresa Lechner, 2021. "Globalized inexact proximal Newton-type methods for nonconvex composite functions," Computational Optimization and Applications, Springer, vol. 78(2), pages 377-410, March.
    7. Tianxiang Liu & Ting Kei Pong, 2017. "Further properties of the forward–backward envelope with applications to difference-of-convex programming," Computational Optimization and Applications, Springer, vol. 67(3), pages 489-520, July.
    8. Shummin Nakayama & Yasushi Narushima & Hiroshi Yabe, 2021. "Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions," Computational Optimization and Applications, Springer, vol. 79(1), pages 127-154, May.
    9. Zehui Jia & Xue Gao & Xingju Cai & Deren Han, 2021. "Local Linear Convergence of the Alternating Direction Method of Multipliers for Nonconvex Separable Optimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 188(1), pages 1-25, January.
    10. Weiwei Kong & Renato D. C. Monteiro, 2023. "An accelerated inexact dampened augmented Lagrangian method for linearly-constrained nonconvex composite optimization problems," Computational Optimization and Applications, Springer, vol. 85(2), pages 509-545, June.
    11. Bastian Pötzl & Anton Schiela & Patrick Jaap, 2022. "Second order semi-smooth Proximal Newton methods in Hilbert spaces," Computational Optimization and Applications, Springer, vol. 82(2), pages 465-498, June.
    12. Yue Xie & Uday V. Shanbhag, 2021. "Tractable ADMM schemes for computing KKT points and local minimizers for ℓ0-minimization problems," Computational Optimization and Applications, Springer, vol. 78(1), pages 43-85, January.
    13. Xihua Zhu & Jiangze Han & Bo Jiang, 2022. "An adaptive high order method for finding third-order critical points of nonconvex optimization," Journal of Global Optimization, Springer, vol. 84(2), pages 369-392, October.
    14. Maryam Yashtini, 2021. "Multi-block Nonconvex Nonsmooth Proximal ADMM: Convergence and Rates Under Kurdyka–Łojasiewicz Property," Journal of Optimization Theory and Applications, Springer, vol. 190(3), pages 966-998, September.
    15. Ghaderi, Susan & Ahookhosh, Masoud & Arany, Adam & Skupin, Alexander & Patrinos, Panagiotis & Moreau, Yves, 2024. "Smoothing unadjusted Langevin algorithms for nonsmooth composite potential functions," Applied Mathematics and Computation, Elsevier, vol. 464(C).
    16. Wu, Dawen & Lisser, Abdel, 2024. "Solving Constrained Pseudoconvex Optimization Problems with deep learning-based neurodynamic optimization," Mathematics and Computers in Simulation (MATCOM), Elsevier, vol. 219(C), pages 424-434.
    17. Tianxiang Liu & Akiko Takeda, 2022. "An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems," Computational Optimization and Applications, Springer, vol. 82(1), pages 141-173, May.
    18. Aviad Aberdam & Amir Beck, 2022. "An Accelerated Coordinate Gradient Descent Algorithm for Non-separable Composite Optimization," Journal of Optimization Theory and Applications, Springer, vol. 193(1), pages 219-246, June.
    19. Kaizhao Sun & X. Andy Sun, 2023. "A two-level distributed algorithm for nonconvex constrained optimization," Computational Optimization and Applications, Springer, vol. 84(2), pages 609-649, March.
    20. Yanli Liu & Wotao Yin, 2019. "An Envelope for Davis–Yin Splitting and Strict Saddle-Point Avoidance," Journal of Optimization Theory and Applications, Springer, vol. 181(2), pages 567-587, May.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:coopap:v:82:y:2022:i:2:d:10.1007_s10589-022-00366-y. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.