
A Hölderian backtracking method for min-max and min-min problems

Author

Listed:
  • Bolte, Jérôme
  • Glaudin, Lilian
  • Pauwels, Edouard
  • Serrurier, Matthieu

Abstract

We present a new algorithm to solve min-max or min-min problems outside the convex world. We rely on rigidity assumptions, ubiquitous in learning, which make our method applicable to many optimization problems. Our approach takes advantage of hidden regularity properties and allows us to devise a simple algorithm of ridge type. An original feature of our method is that it comes with automatic step-size adaptation, departing from the usual overly cautious backtracking methods. In a general framework, we provide theoretical convergence guarantees and rates. We apply our findings to simple GAN problems, obtaining promising numerical results.
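The abstract mentions a ridge-type scheme with automatic step-size adaptation. The sketch below is only a rough illustration of that general idea on a toy smooth saddle problem: an inner loop approximates the ridge value g(x) = max_y f(x, y), and an outer backtracking descent on g lets the step size both shrink and grow. The objective, the inner solver, and every constant here are assumptions made for this example; this is not the authors' Hölderian backtracking algorithm.

```python
# Minimal illustrative sketch (assumed toy problem, NOT the paper's method):
# a ridge-type min-max scheme with an adaptive backtracking step size.
import numpy as np

def f(x, y):
    """Toy smooth saddle objective for min_x max_y f(x, y) (assumed)."""
    return np.sin(x) * y - 0.5 * y ** 2 + 0.1 * x ** 2

def grad_x(x, y):
    return np.cos(x) * y + 0.2 * x

def grad_y(x, y):
    return np.sin(x) - y

def inner_max(x, y, steps=50, lr=0.2):
    """Approximate the ridge value g(x) = max_y f(x, y) by gradient ascent in y."""
    for _ in range(steps):
        y = y + lr * grad_y(x, y)
    return y

def ridge_descent(x=1.0, y=0.0, t=1.0, iters=200):
    """Descent on g(x) = max_y f(x, y) with a step size that both shrinks
    (backtracking) and grows again, mimicking automatic adaptation."""
    for _ in range(iters):
        y = inner_max(x, y)              # refresh the inner maximizer
        gx = grad_x(x, y)
        fx = f(x, y)                     # current (approximate) ridge value
        for _ in range(30):              # bounded backtracking loop
            y_trial = inner_max(x - t * gx, y)
            if f(x - t * gx, y_trial) <= fx - 0.5 * t * gx ** 2:
                break                    # sufficient decrease of the ridge value
            t *= 0.5
        x, y = x - t * gx, y_trial
        t *= 2.0                         # let the step size recover
    return x, y

if __name__ == "__main__":
    print("approximate saddle point:", ridge_descent())
```

On this toy objective the unique saddle point is (0, 0), so the printed iterate should land close to it; the growing step size after each accepted step is what distinguishes this adaptive scheme from a purely cautious backtracking rule.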

Suggested Citation

  • Bolte, Jérôme & Glaudin, Lilian & Pauwels, Edouard & Serrurier, Matthieu, 2021. "A Hölderian backtracking method for min-max and min-min problems," TSE Working Papers 21-1243, Toulouse School of Economics (TSE).
  • Handle: RePEc:tse:wpaper:125888

    Download full text from publisher

    File URL: https://www.tse-fr.eu/sites/default/files/TSE/documents/doc/wp/2021/wp_tse_1243.pdf
    File Function: Working paper
    Download Restriction: no

    References listed on IDEAS

    1. Jérôme Bolte & Shoham Sabach & Marc Teboulle, 2018. "Nonconvex Lagrangian-Based Optimization: Monitoring Schemes and Global Convergence," Mathematics of Operations Research, INFORMS, vol. 43(4), pages 1210-1232, November.
    2. Hédy Attouch & Jérôme Bolte & Patrick Redont & Antoine Soubeyran, 2010. "Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality," Mathematics of Operations Research, INFORMS, vol. 35(2), pages 438-457, May.
    3. Bolte, Jérôme & Castera, Camille & Pauwels, Edouard & Févotte, Cédric, 2019. "An Inertial Newton Algorithm for Deep Learning," TSE Working Papers 19-1043, Toulouse School of Economics (TSE).
    4. Geovani Nunes Grapiglia & Yurii Nesterov, 2019. "Tensor methods for minimizing convex functions with Hölder continuous higher-order derivatives," LIDAM Discussion Papers CORE 2019028, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    5. R. T. Rockafellar, 1981. "Proximal Subgradients, Marginal Values, and Augmented Lagrangians in Nonconvex Optimization," Mathematics of Operations Research, INFORMS, vol. 6(3), pages 424-436, August.
    6. Yurii Nesterov, 2015. "Universal gradient methods for convex optimization problems," LIDAM Reprints CORE 2701, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Radu Ioan Bot & Dang-Khoa Nguyen, 2020. "The Proximal Alternating Direction Method of Multipliers in the Nonconvex Setting: Convergence Analysis and Rates," Mathematics of Operations Research, INFORMS, vol. 45(2), pages 682-712, May.
    2. Bolte, Jérôme & Le, Tam & Pauwels, Edouard & Silveti-Falls, Antonio, 2022. "Nonsmooth Implicit Differentiation for Machine Learning and Optimization," TSE Working Papers 22-1314, Toulouse School of Economics (TSE).
    3. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
    4. Vincenzo Basco, 2020. "Representation of Weak Solutions of Convex Hamilton–Jacobi–Bellman Equations on Infinite Horizon," Journal of Optimization Theory and Applications, Springer, vol. 187(2), pages 370-390, November.
    5. Maryam Yashtini, 2022. "Convergence and rate analysis of a proximal linearized ADMM for nonconvex nonsmooth optimization," Journal of Global Optimization, Springer, vol. 84(4), pages 913-939, December.
    6. Silvia Bonettini & Peter Ochs & Marco Prato & Simone Rebegoldi, 2023. "An abstract convergence framework with application to inertial inexact forward–backward methods," Computational Optimization and Applications, Springer, vol. 84(2), pages 319-362, March.
    7. Le Thi Khanh Hien & Duy Nhat Phan & Nicolas Gillis, 2022. "Inertial alternating direction method of multipliers for non-convex non-smooth optimization," Computational Optimization and Applications, Springer, vol. 83(1), pages 247-285, September.
    8. Francesco Rinaldi & Damiano Zeffiro, 2023. "Avoiding bad steps in Frank-Wolfe variants," Computational Optimization and Applications, Springer, vol. 84(1), pages 225-264, January.
    9. Elena Tovbis & Vladimir Krutikov & Predrag Stanimirović & Vladimir Meshechkin & Aleksey Popov & Lev Kazakovtsev, 2023. "A Family of Multi-Step Subgradient Minimization Methods," Mathematics, MDPI, vol. 11(10), pages 1-24, May.
    10. Emilie Chouzenoux & Jean-Christophe Pesquet & Audrey Repetti, 2016. "A block coordinate variable metric forward–backward algorithm," Journal of Global Optimization, Springer, vol. 66(3), pages 457-485, November.
    11. Kely D. V. Villacorta & Paulo R. Oliveira & Antoine Soubeyran, 2014. "A Trust-Region Method for Unconstrained Multiobjective Problems with Applications in Satisficing Processes," Journal of Optimization Theory and Applications, Springer, vol. 160(3), pages 865-889, March.
    12. Zhili Ge & Zhongming Wu & Xin Zhang & Qin Ni, 2023. "An extrapolated proximal iteratively reweighted method for nonconvex composite optimization problems," Journal of Global Optimization, Springer, vol. 86(4), pages 821-844, August.
    13. Xi Yin Zheng & Xiaoqi Yang, 2007. "Lagrange Multipliers in Nonsmooth Semi-Infinite Optimization Problems," Mathematics of Operations Research, INFORMS, vol. 32(1), pages 168-181, February.
    14. Masaru Ito, 2016. "New results on subgradient methods for strongly convex optimization problems with a unified analysis," Computational Optimization and Applications, Springer, vol. 65(1), pages 127-172, September.
    15. Bo Jiang & Tianyi Lin & Shiqian Ma & Shuzhong Zhang, 2019. "Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis," Computational Optimization and Applications, Springer, vol. 72(1), pages 115-157, January.
    16. Zehui Jia & Jieru Huang & Xingju Cai, 2021. "Proximal-like incremental aggregated gradient method with Bregman distance in weakly convex optimization problems," Journal of Global Optimization, Springer, vol. 80(4), pages 841-864, August.
    17. Dominikus Noll, 2014. "Convergence of Non-smooth Descent Methods Using the Kurdyka–Łojasiewicz Inequality," Journal of Optimization Theory and Applications, Springer, vol. 160(2), pages 553-572, February.
    18. Masoud Ahookhosh, 2019. "Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 89(3), pages 319-353, June.
    19. Peter Ochs, 2018. "Local Convergence of the Heavy-Ball Method and iPiano for Non-convex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 177(1), pages 153-180, April.
    20. Glaydston Carvalho Bento & João Xavier Cruz Neto & Antoine Soubeyran & Valdinês Leite Sousa Júnior, 2016. "Dual Descent Methods as Tension Reduction Systems," Journal of Optimization Theory and Applications, Springer, vol. 171(1), pages 209-227, October.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:tse:wpaper:125888. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: the person in charge (email available below). General contact details of provider: https://edirc.repec.org/data/tsetofr.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.