
A proximal difference-of-convex algorithm with extrapolation

Author

Listed:
  • Bo Wen

    (School of Science, Hebei University of Technology
    Harbin Institute of Technology
    The Hong Kong Polytechnic University)

  • Xiaojun Chen

    (The Hong Kong Polytechnic University)

  • Ting Kei Pong

    (The Hong Kong Polytechnic University)

Abstract

We consider a class of difference-of-convex (DC) optimization problems whose objective is level-bounded and is the sum of a smooth convex function with Lipschitz gradient, a proper closed convex function and a continuous concave function. While this kind of problem can be solved by the classical difference-of-convex algorithm (DCA) (Pham et al. Acta Math Vietnam 22:289–355, 1997), the difficulty of the subproblems of this algorithm depends heavily on the choice of DC decomposition. Simpler subproblems can be obtained by using a specific DC decomposition described in Pham et al. (SIAM J Optim 8:476–505, 1998). This decomposition has been proposed in numerous works such as Gotoh et al. (DC formulations and algorithms for sparse optimization problems, 2017), and we refer to the resulting DCA as the proximal DCA. Although the subproblems are simpler, the proximal DCA is the same as the proximal gradient algorithm when the concave part of the objective is void, and hence is potentially slow in practice. In this paper, motivated by the extrapolation techniques for accelerating the proximal gradient algorithm in the convex setting, we consider a proximal difference-of-convex algorithm with extrapolation to possibly accelerate the proximal DCA. We show that any cluster point of the sequence generated by our algorithm is a stationary point of the DC optimization problem for a fairly general choice of extrapolation parameters: in particular, the parameters can be chosen as in FISTA with fixed restart (O’Donoghue and Candès in Found Comput Math 15:715–732, 2015). In addition, by assuming the Kurdyka-Łojasiewicz property of the objective and the differentiability of the concave part, we establish global convergence of the sequence generated by our algorithm and analyze its convergence rate.
Our numerical experiments on two difference-of-convex regularized least squares models show that our algorithm usually outperforms the proximal DCA and the general iterative shrinkage and thresholding algorithm proposed in Gong et al. (A general iterative shrinkage and thresholding algorithm for non-convex regularized optimization problems, 2013).
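The algorithm described in the abstract can be sketched in code. The following is a minimal illustration, not the authors' implementation: it applies a proximal DC iteration with FISTA-style extrapolation and fixed restart to the ℓ1-ℓ2 regularized least squares model min 0.5‖Ax−b‖² + λ(‖x‖₁ − ‖x‖₂), one instance of the DC-regularized least squares family the paper tests. The function name, stopping rule, and restart period of 200 are assumptions chosen for the sketch.

```python
import numpy as np

def pdca_e(A, b, lam, max_iter=1000, tol=1e-8, restart=200):
    """Sketch of a proximal DC algorithm with extrapolation for
        min 0.5*||Ax - b||^2 + lam*(||x||_1 - ||x||_2),
    with f(x) = 0.5*||Ax - b||^2 (smooth convex, Lipschitz gradient),
    P1(x) = lam*||x||_1 (proper closed convex), and concave part
    -P2(x) = -lam*||x||_2."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
    n = A.shape[1]
    x_prev = x = np.zeros(n)
    t = 1.0                                # FISTA momentum parameter
    for k in range(max_iter):
        # a subgradient of P2(x) = lam*||x||_2 at the current iterate
        nx = np.linalg.norm(x)
        xi = lam * x / nx if nx > 0 else np.zeros(n)
        # extrapolation step with FISTA parameters
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        beta = (t - 1.0) / t_next
        y = x + beta * (x - x_prev)
        # proximal gradient step on f + P1 at y, shifted by xi:
        # x_new = prox_{P1/L}(y - (grad f(y) - xi)/L), i.e. soft thresholding
        g = A.T @ (A @ y - b) - xi
        z = y - g / L
        x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
        if np.linalg.norm(x_new - x) < tol * max(1.0, np.linalg.norm(x)):
            x = x_new
            break
        x_prev, x = x, x_new
        # fixed restart: reset the momentum parameter periodically
        t = 1.0 if (k + 1) % restart == 0 else t_next
    return x
```

With beta = 0 throughout this reduces to the proximal DCA; the restart keeps the extrapolation parameters bounded away from 1, which is the kind of condition the paper's convergence analysis requires.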

Suggested Citation

  • Bo Wen & Xiaojun Chen & Ting Kei Pong, 2018. "A proximal difference-of-convex algorithm with extrapolation," Computational Optimization and Applications, Springer, vol. 69(2), pages 297-324, March.
  • Handle: RePEc:spr:coopap:v:69:y:2018:i:2:d:10.1007_s10589-017-9954-1
    DOI: 10.1007/s10589-017-9954-1

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10589-017-9954-1
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.


    As the access to this document is restricted, you may want to search for a different version of it.


    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Glaydston Carvalho Bento & Sandro Dimy Barbosa Bitar & João Xavier Cruz Neto & Antoine Soubeyran & João Carlos Oliveira Souza, 2020. "A proximal point method for difference of convex functions in multi-objective optimization with application to group dynamic problems," Computational Optimization and Applications, Springer, vol. 75(1), pages 263-290, January.
    2. Jinxin Wang & Zengde Deng & Taoli Zheng & Anthony Man-Cho So, 2020. "Sparse High-Order Portfolios via Proximal DCA and SCA," Papers 2008.12953, arXiv.org, revised Jun 2021.
    3. Chungen Shen & Xiao Liu, 2021. "Solving nonnegative sparsity-constrained optimization via DC quadratic-piecewise-linear approximations," Journal of Global Optimization, Springer, vol. 81(4), pages 1019-1055, December.
    4. Weiwei Kong & Renato D. C. Monteiro, 2022. "Accelerated inexact composite gradient methods for nonconvex spectral optimization problems," Computational Optimization and Applications, Springer, vol. 82(3), pages 673-715, July.
    5. Kai Tu & Haibin Zhang & Huan Gao & Junkai Feng, 2020. "A hybrid Bregman alternating direction method of multipliers for the linearly constrained difference-of-convex problems," Journal of Global Optimization, Springer, vol. 76(4), pages 665-693, April.
    6. Peiran Yu & Ting Kei Pong, 2019. "Iteratively reweighted ℓ1 algorithms with extrapolation," Computational Optimization and Applications, Springer, vol. 73(2), pages 353-386, June.
    7. Hongbo Dong & Min Tao, 2021. "On the Linear Convergence to Weak/Standard d-Stationary Points of DCA-Based Algorithms for Structured Nonsmooth DC Programming," Journal of Optimization Theory and Applications, Springer, vol. 189(1), pages 190-220, April.
    8. W. Ackooij & S. Demassey & P. Javal & H. Morais & W. Oliveira & B. Swaminathan, 2021. "A bundle method for nonsmooth DC programming with application to chance-constrained problems," Computational Optimization and Applications, Springer, vol. 78(2), pages 451-490, March.
    9. Yldenilson Torres Almeida & João Xavier Cruz Neto & Paulo Roberto Oliveira & João Carlos de Oliveira Souza, 2020. "A modified proximal point method for DC functions on Hadamard manifolds," Computational Optimization and Applications, Springer, vol. 76(3), pages 649-673, July.
    10. Tianxiang Liu & Ting Kei Pong & Akiko Takeda, 2019. "A refined convergence analysis of pDCAe with applications to simultaneous sparse recovery and outlier detection," Computational Optimization and Applications, Springer, vol. 73(1), pages 69-100, May.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Zhongming Wu & Min Li, 2019. "General inertial proximal gradient method for a class of nonconvex nonsmooth optimization problems," Computational Optimization and Applications, Springer, vol. 73(1), pages 129-158, May.
    2. Lorenzo Stella & Andreas Themelis & Panagiotis Patrinos, 2017. "Forward–backward quasi-Newton methods for nonsmooth optimization problems," Computational Optimization and Applications, Springer, vol. 67(3), pages 443-487, July.
    3. Lei Yang, 2024. "Proximal Gradient Method with Extrapolation and Line Search for a Class of Non-convex and Non-smooth Problems," Journal of Optimization Theory and Applications, Springer, vol. 200(1), pages 68-103, January.
    4. Tianxiang Liu & Ting Kei Pong & Akiko Takeda, 2019. "A refined convergence analysis of pDCAe with applications to simultaneous sparse recovery and outlier detection," Computational Optimization and Applications, Springer, vol. 73(1), pages 69-100, May.
    5. Fan Wu & Wei Bian, 2020. "Accelerated iterative hard thresholding algorithm for l0 regularized regression problem," Journal of Global Optimization, Springer, vol. 76(4), pages 819-840, April.
    6. Peiran Yu & Ting Kei Pong, 2019. "Iteratively reweighted ℓ1 algorithms with extrapolation," Computational Optimization and Applications, Springer, vol. 73(2), pages 353-386, June.
    7. TAYLOR, Adrien B. & HENDRICKX, Julien M. & François GLINEUR, 2016. "Exact worst-case performance of first-order methods for composite convex optimization," LIDAM Discussion Papers CORE 2016052, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    8. Bo Jiang & Tianyi Lin & Shiqian Ma & Shuzhong Zhang, 2019. "Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis," Computational Optimization and Applications, Springer, vol. 72(1), pages 115-157, January.
    9. Sun, Shilin & Wang, Tianyang & Yang, Hongxing & Chu, Fulei, 2022. "Damage identification of wind turbine blades using an adaptive method for compressive beamforming based on the generalized minimax-concave penalty function," Renewable Energy, Elsevier, vol. 181(C), pages 59-70.
    10. Anda Tang & Pei Quan & Lingfeng Niu & Yong Shi, 2022. "A Survey for Sparse Regularization Based Compression Methods," Annals of Data Science, Springer, vol. 9(4), pages 695-722, August.
    11. W. Ackooij & S. Demassey & P. Javal & H. Morais & W. Oliveira & B. Swaminathan, 2021. "A bundle method for nonsmooth DC programming with application to chance-constrained problems," Computational Optimization and Applications, Springer, vol. 78(2), pages 451-490, March.
    12. Nguyen Hieu Thao, 2018. "A convergent relaxation of the Douglas–Rachford algorithm," Computational Optimization and Applications, Springer, vol. 70(3), pages 841-863, July.
    13. Jérôme Bolte & Edouard Pauwels, 2016. "Majorization-Minimization Procedures and Convergence of SQP Methods for Semi-Algebraic and Tame Programs," Mathematics of Operations Research, INFORMS, vol. 41(2), pages 442-465, May.
    14. Minh Pham & Xiaodong Lin & Andrzej Ruszczyński & Yu Du, 2021. "An outer–inner linearization method for non-convex and nondifferentiable composite regularization problems," Journal of Global Optimization, Springer, vol. 81(1), pages 179-202, September.
    15. Liu, Jingjing & Ma, Ruijie & Zeng, Xiaoyang & Liu, Wanquan & Wang, Mingyu & Chen, Hui, 2021. "An efficient non-convex total variation approach for image deblurring and denoising," Applied Mathematics and Computation, Elsevier, vol. 397(C).
    16. Min Li & Zhongming Wu, 2019. "Convergence Analysis of the Generalized Splitting Methods for a Class of Nonconvex Optimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 183(2), pages 535-565, November.
    17. Yakui Huang & Hongwei Liu, 2016. "Smoothing projected Barzilai–Borwein method for constrained non-Lipschitz optimization," Computational Optimization and Applications, Springer, vol. 65(3), pages 671-698, December.
    18. Yaohua Hu & Chong Li & Kaiwen Meng & Xiaoqi Yang, 2021. "Linear convergence of inexact descent method and inexact proximal gradient algorithms for lower-order regularization problems," Journal of Global Optimization, Springer, vol. 79(4), pages 853-883, April.
    19. Dongdong Zhang & Shaohua Pan & Shujun Bi & Defeng Sun, 2023. "Zero-norm regularized problems: equivalent surrogates, proximal MM method and statistical error bound," Computational Optimization and Applications, Springer, vol. 86(2), pages 627-667, November.
    20. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization," Computational Optimization and Applications, Springer, vol. 79(3), pages 681-715, July.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.