
The Equivalence of Three Types of Error Bounds for Weakly and Approximately Convex Functions

Authors

  • Sixuan Bai (Chongqing Jiaotong University)
  • Minghua Li (Chongqing University of Arts and Sciences)
  • Chengwu Lu (Chongqing University of Arts and Sciences)
  • Daoli Zhu (Antai College of Economics and Management and Sino-US Global Logistics Institute)
  • Sien Deng (Northern Illinois University)

Abstract

We start by establishing the equivalence of three types of error bounds: weak sharp minima, level-set subdifferential error bounds, and Łojasiewicz (Ł for short) inequalities for weakly convex functions with exponent $$\alpha \in [0,1]$$ and for approximately convex functions. We then apply these equivalence results to a class of nonconvex optimization problems whose objective function is the sum of a convex function and the composition of a locally Lipschitz function with a smooth vector-valued function. Finally, applying a characterization of lower-order regularization problems, we show that the level-set subdifferential error bound with exponent 1 and the Ł inequality with exponent $$\frac{1}{2}$$ hold at a local minimum point.
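
For orientation, the three conditions can be sketched in one common formulation; the symbols below ($$f$$, $$\bar{x}$$, $$S$$, $$U$$, $$c$$, $$p$$, $$\gamma$$, $$\theta$$), as well as the precise neighborhoods, constants, and exponent conventions, are illustrative assumptions and may differ from the exact definitions used in the paper. Let $$f$$ be a proper lower semicontinuous function, $$\bar{x}$$ a local minimizer, $$S$$ the associated set of minimizers, $$U$$ a neighborhood of $$\bar{x}$$, and $$c>0$$ a constant:

  • Weak sharp minima of order $$p \ge 1$$: $$f(x) - f(\bar{x}) \ge c\,\mathrm{dist}(x,S)^{p}$$ for all $$x \in U$$.
  • Level-set subdifferential error bound with exponent $$\gamma > 0$$: $$\mathrm{dist}\big(x,[f \le f(\bar{x})]\big) \le c\,\mathrm{dist}\big(0,\partial f(x)\big)^{\gamma}$$ for all $$x \in U$$ with $$f(x) > f(\bar{x})$$.
  • Ł inequality with exponent $$\theta \in (0,1)$$: $$\big(f(x) - f(\bar{x})\big)^{\theta} \le c\,\mathrm{dist}\big(0,\partial f(x)\big)$$ for all $$x \in U$$ with $$f(x) > f(\bar{x})$$.

Here $$\partial f$$ denotes a (limiting) subdifferential, $$\mathrm{dist}$$ the point-to-set distance, and $$[f \le f(\bar{x})]$$ the corresponding sublevel set.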

Suggested Citation

  • Sixuan Bai & Minghua Li & Chengwu Lu & Daoli Zhu & Sien Deng, 2022. "The Equivalence of Three Types of Error Bounds for Weakly and Approximately Convex Functions," Journal of Optimization Theory and Applications, Springer, vol. 194(1), pages 220-245, July.
  • Handle: RePEc:spr:joptap:v:194:y:2022:i:1:d:10.1007_s10957-022-02016-z
    DOI: 10.1007/s10957-022-02016-z

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-022-02016-z
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-022-02016-z?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Emilie Chouzenoux & Jean-Christophe Pesquet & Audrey Repetti, 2014. "Variable Metric Forward–Backward Algorithm for Minimizing the Sum of a Differentiable Function and a Convex Function," Journal of Optimization Theory and Applications, Springer, vol. 162(1), pages 107-132, July.
    2. Yaohua Hu & Chong Li & Kaiwen Meng & Xiaoqi Yang, 2021. "Linear convergence of inexact descent method and inexact proximal gradient algorithms for lower-order regularization problems," Journal of Global Optimization, Springer, vol. 79(4), pages 853-883, April.
    3. Daoli Zhu & Sien Deng & Minghua Li & Lei Zhao, 2021. "Level-Set Subdifferential Error Bounds and Linear Convergence of Bregman Proximal Gradient Method," Journal of Optimization Theory and Applications, Springer, vol. 189(3), pages 889-918, June.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Szilárd Csaba László, 2023. "A Forward–Backward Algorithm With Different Inertial Terms for Structured Non-Convex Minimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 198(1), pages 387-427, July.
    2. Hao Wang & Hao Zeng & Jiashan Wang, 2022. "An extrapolated iteratively reweighted $$\ell _1$$ method with complexity analysis," Computational Optimization and Applications, Springer, vol. 83(3), pages 967-997, December.
    3. Silvia Bonettini & Peter Ochs & Marco Prato & Simone Rebegoldi, 2023. "An abstract convergence framework with application to inertial inexact forward–backward methods," Computational Optimization and Applications, Springer, vol. 84(2), pages 319-362, March.
    4. Emilie Chouzenoux & Jean-Christophe Pesquet & Audrey Repetti, 2016. "A block coordinate variable metric forward–backward algorithm," Journal of Global Optimization, Springer, vol. 66(3), pages 457-485, November.
    5. Peter Ochs, 2018. "Local Convergence of the Heavy-Ball Method and iPiano for Non-convex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 177(1), pages 153-180, April.
    6. S. Bonettini & M. Prato & S. Rebegoldi, 2018. "A block coordinate variable metric linesearch based proximal gradient method," Computational Optimization and Applications, Springer, vol. 71(1), pages 5-52, September.
    7. Ching-pei Lee & Stephen J. Wright, 2019. "Inexact Successive quadratic approximation for regularized optimization," Computational Optimization and Applications, Springer, vol. 72(3), pages 641-674, April.
    8. Marc C. Robini & Lihui Wang & Yuemin Zhu, 2024. "The appeals of quadratic majorization–minimization," Journal of Global Optimization, Springer, vol. 89(3), pages 509-558, July.
    9. Radu Ioan Boţ & Ernö Robert Csetnek & Szilárd Csaba László, 2016. "An inertial forward–backward algorithm for the minimization of the sum of two nonconvex functions," EURO Journal on Computational Optimization, Springer;EURO - The Association of European Operational Research Societies, vol. 4(1), pages 3-25, February.
    10. Radu Ioan Boţ & Ernö Robert Csetnek, 2016. "An Inertial Tseng’s Type Proximal Algorithm for Nonsmooth and Nonconvex Optimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 171(2), pages 600-616, November.
    11. J. C. De Los Reyes & E. Loayza & P. Merino, 2017. "Second-order orthant-based methods with enriched Hessian information for sparse $$\ell _1$$-optimization," Computational Optimization and Applications, Springer, vol. 67(2), pages 225-258, June.
    12. Bonettini, S. & Prato, M. & Rebegoldi, S., 2021. "New convergence results for the inexact variable metric forward–backward method," Applied Mathematics and Computation, Elsevier, vol. 392(C).
    13. Daoli Zhu & Sien Deng & Minghua Li & Lei Zhao, 2021. "Level-Set Subdifferential Error Bounds and Linear Convergence of Bregman Proximal Gradient Method," Journal of Optimization Theory and Applications, Springer, vol. 189(3), pages 889-918, June.
    14. Tianxiang Liu & Akiko Takeda, 2022. "An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems," Computational Optimization and Applications, Springer, vol. 82(1), pages 141-173, May.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joptap:v:194:y:2022:i:1:d:10.1007_s10957-022-02016-z. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.