
An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization

Author

Listed:
  • Qihang Lin
  • Lin Xiao

Abstract

We consider optimization problems with an objective function that is the sum of two convex terms: one is smooth and given by a black-box oracle, and the other is general but with a simple, known structure. We first present an accelerated proximal gradient (APG) method for problems where the smooth part of the objective function is also strongly convex. This method incorporates an efficient line-search procedure, and achieves the optimal iteration complexity for such composite optimization problems. In case the strong convexity parameter is unknown, we also develop an adaptive scheme that can automatically estimate it on the fly, at the cost of a slightly worse iteration complexity. Then we focus on the special case of solving the $$\ell_1$$-regularized least-squares problem in the high-dimensional setting. In such a context, the smooth part of the objective (least-squares) is not strongly convex over the entire domain. Nevertheless, we can exploit its restricted strong convexity over sparse vectors using the adaptive APG method combined with a homotopy continuation scheme. We show that such a combination leads to a global geometric rate of convergence, and the overall iteration complexity has a weaker dependency on the restricted condition number than previous work. Copyright Springer Science+Business Media New York 2015
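To make the composite setting concrete, the following is a minimal FISTA-style sketch of an accelerated proximal gradient step with a backtracking line search, applied to least squares plus an $$\ell_1$$ penalty. This is a generic illustration of the technique named in the abstract, not the authors' exact algorithm: the names (`apg_linesearch`, `soft_threshold`), the backtracking rule, and the momentum choices are our assumptions.

```python
# Sketch: accelerated proximal gradient with backtracking line search for
#     minimize f(x) + Psi(x),  f(x) = 0.5*||A x - b||^2,  Psi = lam*||.||_1.
# Illustrative only; details differ from the paper's method.

import numpy as np

def soft_threshold(v, t):
    """Prox of t*||.||_1: the 'simple, known structure' of the nonsmooth term."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def apg_linesearch(A, b, lam, x0, L0=1.0, mu=0.0, n_iters=300):
    """APG with backtracking on the local Lipschitz estimate L.
    If mu > 0 (strong convexity parameter known), use the constant
    momentum beta = (sqrt(L) - sqrt(mu)) / (sqrt(L) + sqrt(mu))."""
    f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
    grad = lambda x: A.T @ (A @ x - b)
    x, y, L, t = x0.copy(), x0.copy(), L0, 1.0
    for _ in range(n_iters):
        g = grad(y)
        while True:  # backtrack until the quadratic upper bound holds at x_new
            x_new = soft_threshold(y - g / L, lam / L)
            d = x_new - y
            if f(x_new) <= f(y) + g @ d + 0.5 * L * (d @ d):
                break
            L *= 2.0
        if mu > 0:
            beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))
        else:  # standard FISTA momentum when mu is unknown or zero
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            beta = (t - 1.0) / t_new
            t = t_new
        y = x_new + beta * (x_new - x)
        x = x_new
        L /= 2.0  # let the step-size estimate grow again between iterations

    return x

# Usage on synthetic data (assumed, for illustration):
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 200))
b = A @ np.r_[np.ones(5), np.zeros(195)]
x = apg_linesearch(A, b, lam=0.1, x0=np.zeros(200))
```

The paper's adaptive scheme estimates the strong convexity parameter on the fly rather than taking `mu` as given; the fixed `mu` argument above is a simplification.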
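The homotopy continuation idea can likewise be sketched: solve a sequence of $$\ell_1$$-regularized least-squares problems with geometrically decreasing regularization, warm-starting each stage at the previous solution so the iterates stay sparse (which is where restricted strong convexity can be exploited). The stage solver below is plain proximal gradient for brevity, and the decrease factor `eta`, stage length, and function names are illustrative assumptions rather than the paper's parameters.

```python
# Sketch: homotopy continuation for l1-regularized least squares.
# Each stage warm-starts from the previous solution; illustrative only.

import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_stage(A, b, lam, x, L, n_iters=100):
    """Fixed-step proximal gradient (ISTA) for one homotopy stage."""
    for _ in range(n_iters):
        x = soft_threshold(x - A.T @ (A @ x - b) / L, lam / L)
    return x

def homotopy_l1_ls(A, b, lam_target, eta=0.7):
    L = np.linalg.norm(A, 2) ** 2         # Lipschitz constant of the gradient
    lam = np.max(np.abs(A.T @ b))         # at or above this value, x = 0 is optimal
    x = np.zeros(A.shape[1])
    while lam > lam_target:
        lam = max(eta * lam, lam_target)  # shrink the regularization geometrically
        x = prox_grad_stage(A, b, lam, x, L)  # warm start from the previous stage
    return x

# Usage: recover a 10-sparse vector from 100 noisy random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 400))
x_true = np.zeros(400); x_true[:10] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = homotopy_l1_ls(A, b, lam_target=0.1)
```

In the paper, each stage is solved with the adaptive APG method rather than ISTA, which is what yields the global geometric convergence rate stated in the abstract.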

Suggested Citation

  • Qihang Lin & Lin Xiao, 2015. "An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization," Computational Optimization and Applications, Springer, vol. 60(3), pages 633-674, April.
  • Handle: RePEc:spr:coopap:v:60:y:2015:i:3:p:633-674
    DOI: 10.1007/s10589-014-9694-4

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1007/s10589-014-9694-4
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1007/s10589-014-9694-4?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. NESTEROV, Yu., 2007. "Gradient methods for minimizing composite objective function," LIDAM Discussion Papers CORE 2007076, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Olivier Fercoq & Zheng Qu, 2020. "Restarting the accelerated coordinate descent method with a rough strong convexity estimate," Computational Optimization and Applications, Springer, vol. 75(1), pages 63-91, January.
    2. Masaru Ito & Mituhiro Fukuda, 2021. "Nearly Optimal First-Order Methods for Convex Optimization under Gradient Norm Measure: an Adaptive Regularization Approach," Journal of Optimization Theory and Applications, Springer, vol. 188(3), pages 770-804, March.
    3. Qihang Lin & Runchao Ma & Yangyang Xu, 2022. "Complexity of an inexact proximal-point penalty method for constrained smooth non-convex optimization," Computational Optimization and Applications, Springer, vol. 82(1), pages 175-224, May.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
    2. Kohli, Priya & Garcia, Tanya P. & Pourahmadi, Mohsen, 2016. "Modeling the Cholesky factors of covariance matrices of multivariate longitudinal data," Journal of Multivariate Analysis, Elsevier, vol. 145(C), pages 87-100.
    3. DEVOLDER, Olivier & GLINEUR, François & NESTEROV, Yurii, 2011. "First-order methods of smooth convex optimization with inexact oracle," LIDAM Discussion Papers CORE 2011002, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    4. Jiahe Lin & George Michailidis, 2019. "Approximate Factor Models with Strongly Correlated Idiosyncratic Errors," Papers 1912.04123, arXiv.org.
    5. Mingqiang Li & Congying Han & Ruxin Wang & Tiande Guo, 2017. "Shrinking gradient descent algorithms for total variation regularized image denoising," Computational Optimization and Applications, Springer, vol. 68(3), pages 643-660, December.
    6. Hansen, Christian & Liao, Yuan, 2019. "The Factor-Lasso And K-Step Bootstrap Approach For Inference In High-Dimensional Economic Applications," Econometric Theory, Cambridge University Press, vol. 35(3), pages 465-509, June.
    7. Lingxue Zhang & Seyoung Kim, 2014. "Learning Gene Networks under SNP Perturbations Using eQTL Datasets," PLOS Computational Biology, Public Library of Science, vol. 10(2), pages 1-20, February.
    8. Umberto Amato & Anestis Antoniadis & Italia Feis & Irène Gijbels, 2022. "Penalized wavelet estimation and robust denoising for irregular spaced data," Computational Statistics, Springer, vol. 37(4), pages 1621-1651, September.
    9. Silvia Villa & Lorenzo Rosasco & Sofia Mosci & Alessandro Verri, 2014. "Proximal methods for the latent group lasso penalty," Computational Optimization and Applications, Springer, vol. 58(2), pages 381-407, June.
    10. Wachirapong Jirakitpuwapat & Poom Kumam & Yeol Je Cho & Kanokwan Sitthithakerngkiet, 2019. "A General Algorithm for the Split Common Fixed Point Problem with Its Applications to Signal Processing," Mathematics, MDPI, vol. 7(3), pages 1-20, February.
    11. Kenneth Lange & Eric C. Chi & Hua Zhou, 2014. "A Brief Survey of Modern Optimization for Statisticians," International Statistical Review, International Statistical Institute, vol. 82(1), pages 46-70, April.
    12. D. Russell Luke & Nguyen H. Thao & Matthew K. Tam, 2018. "Quantitative Convergence Analysis of Iterated Expansive, Set-Valued Mappings," Mathematics of Operations Research, INFORMS, vol. 43(4), pages 1143-1176, November.
    13. Majid Jahani & Naga Venkata C. Gudapati & Chenxin Ma & Rachael Tappenden & Martin Takáč, 2021. "Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences," Computational Optimization and Applications, Springer, vol. 79(2), pages 369-404, June.
    14. Zachary F. Fisher & Younghoon Kim & Barbara L. Fredrickson & Vladas Pipiras, 2022. "Penalized Estimation and Forecasting of Multiple Subject Intensive Longitudinal Data," Psychometrika, Springer;The Psychometric Society, vol. 87(2), pages 1-29, June.
    15. Li, Xin & Wu, Dongya & Li, Chong & Wang, Jinhua & Yao, Jen-Chih, 2020. "Sparse recovery via nonconvex regularized M-estimators over ℓq-balls," Computational Statistics & Data Analysis, Elsevier, vol. 152(C).
    16. Yao Dong & He Jiang, 2018. "A Two-Stage Regularization Method for Variable Selection and Forecasting in High-Order Interaction Model," Complexity, Hindawi, vol. 2018, pages 1-12, November.
    17. Radu-Alexandru Dragomir & Alexandre d’Aspremont & Jérôme Bolte, 2021. "Quartic First-Order Methods for Low-Rank Minimization," Journal of Optimization Theory and Applications, Springer, vol. 189(2), pages 341-363, May.
    18. Min Li & Li-Zhi Liao & Xiaoming Yuan, 2013. "Inexact Alternating Direction Methods of Multipliers with Logarithmic–Quadratic Proximal Regularization," Journal of Optimization Theory and Applications, Springer, vol. 159(2), pages 412-436, November.
    19. Qihang Lin & Xi Chen & Javier Peña, 2014. "A sparsity preserving stochastic gradient methods for sparse regression," Computational Optimization and Applications, Springer, vol. 58(2), pages 455-482, June.
    20. Xiubo Liang & Guoqiang Wang & Bo Yu, 2022. "A reduced proximal-point homotopy method for large-scale non-convex BQP," Computational Optimization and Applications, Springer, vol. 81(2), pages 539-567, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:coopap:v:60:y:2015:i:3:p:633-674. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.