
A sparsity preserving stochastic gradient methods for sparse regression

Authors
  • Qihang Lin
  • Xi Chen
  • Javier Peña

Abstract

We propose a new stochastic first-order algorithm for solving sparse regression problems. In each iteration, our algorithm utilizes a stochastic oracle of the subgradient of the objective function. Our algorithm is based on a stochastic version of the estimate sequence technique introduced by Nesterov (Introductory lectures on convex optimization: a basic course, Kluwer, Amsterdam, 2003 ). The convergence rate of our algorithm depends continuously on the noise level of the gradient. In particular, in the limiting case of noiseless gradient, the convergence rate of our algorithm is the same as that of optimal deterministic gradient algorithms. We also establish some large deviation properties of our algorithm. Unlike existing stochastic gradient methods with optimal convergence rates, our algorithm has the advantage of readily enforcing sparsity at all iterations, which is a critical property for applications of sparse regressions. Copyright Springer Science+Business Media New York 2014
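
As a concrete illustration of the property the abstract highlights, the following minimal Python sketch shows why a proximal (soft-thresholding) step after each stochastic gradient update keeps every iterate sparse. This is not the authors' algorithm: their method accelerates via Nesterov's estimate-sequence technique, whereas the sketch below is a plain stochastic proximal gradient loop; the l1-regularized least-squares objective, the 1/sqrt(k) step-size schedule, and all function names are illustrative assumptions.

    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t * ||.||_1: shrinks each coordinate toward zero,
        # zeroing out any coordinate with magnitude below t.
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def stochastic_prox_grad_lasso(X, y, lam=0.1, step0=1.0, n_iters=1000, seed=0):
        # Sketch of stochastic proximal gradient for
        #   min_w (1/2n) ||Xw - y||^2 + lam * ||w||_1.
        # Each iteration queries a stochastic oracle (the gradient on one random
        # sample) and then applies soft-thresholding, so every iterate is sparse.
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w = np.zeros(d)
        for k in range(1, n_iters + 1):
            i = rng.integers(n)                           # stochastic oracle: one sample
            g = (X[i] @ w - y[i]) * X[i]                  # unbiased gradient estimate
            step = step0 / np.sqrt(k)                     # diminishing step size (assumed)
            w = soft_threshold(w - step * g, step * lam)  # prox step enforces sparsity
        return w

    if __name__ == "__main__":
        # Synthetic sparse regression: 5 true nonzeros out of 50 coefficients.
        rng = np.random.default_rng(1)
        n, d, s = 200, 50, 5
        w_true = np.zeros(d)
        w_true[:s] = rng.normal(size=s)
        X = rng.normal(size=(n, d))
        y = X @ w_true + 0.01 * rng.normal(size=n)
        w_hat = stochastic_prox_grad_lasso(X, y, lam=0.05, n_iters=5000)
        print("nonzeros in estimate:", np.count_nonzero(np.abs(w_hat) > 1e-8))

The contrast the abstract draws is with stochastic methods that achieve optimal rates only by averaging iterates, which destroys sparsity; in the sketch above, sparsity holds at every iterate because each update passes through the l1 proximal operator.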

Suggested Citation

  • Qihang Lin & Xi Chen & Javier Peña, 2014. "A sparsity preserving stochastic gradient methods for sparse regression," Computational Optimization and Applications, Springer, vol. 58(2), pages 455-482, June.
  • Handle: RePEc:spr:coopap:v:58:y:2014:i:2:p:455-482
    DOI: 10.1007/s10589-013-9633-9

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1007/s10589-013-9633-9
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1007/s10589-013-9633-9?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Nesterov, Yu., 2007. "Gradient methods for minimizing composite objective function," LIDAM Discussion Papers CORE 2007076, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    2. Nesterov, Yu., 2005. "Smooth minimization of non-smooth functions," LIDAM Reprints CORE 1819, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    3. Ming Yuan & Yi Lin, 2006. "Model selection and estimation in regression with grouped variables," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 68(1), pages 49-67, February.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Lorenzo Rosasco & Silvia Villa & Bang Công Vũ, 2016. "Stochastic Forward–Backward Splitting for Monotone Inclusions," Journal of Optimization Theory and Applications, Springer, vol. 169(2), pages 388-406, May.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Silvia Villa & Lorenzo Rosasco & Sofia Mosci & Alessandro Verri, 2014. "Proximal methods for the latent group lasso penalty," Computational Optimization and Applications, Springer, vol. 58(2), pages 381-407, June.
    2. David Degras, 2021. "Sparse group fused lasso for model segmentation: a hybrid approach," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 15(3), pages 625-671, September.
    3. Xiubo Liang & Guoqiang Wang & Bo Yu, 2022. "A reduced proximal-point homotopy method for large-scale non-convex BQP," Computational Optimization and Applications, Springer, vol. 81(2), pages 539-567, March.
    4. Jiang, He & Luo, Shihua & Dong, Yao, 2021. "Simultaneous feature selection and clustering based on square root optimization," European Journal of Operational Research, Elsevier, vol. 289(1), pages 214-231.
    5. Renato D. C. Monteiro & Camilo Ortiz & Benar F. Svaiter, 2016. "An adaptive accelerated first-order method for convex optimization," Computational Optimization and Applications, Springer, vol. 64(1), pages 31-73, May.
    6. Bo Wen & Xiaoping Xue, 2019. "On the convergence of the iterates of proximal gradient algorithm with extrapolation for convex nonsmooth minimization problems," Journal of Global Optimization, Springer, vol. 75(3), pages 767-787, November.
    7. Guoqiang Wang & Bo Yu, 2019. "PAL-Hom method for QP and an application to LP," Computational Optimization and Applications, Springer, vol. 73(1), pages 311-352, May.
    8. Li, Peili & Xiao, Yunhai, 2018. "An efficient algorithm for sparse inverse covariance matrix estimation based on dual formulation," Computational Statistics & Data Analysis, Elsevier, vol. 128(C), pages 292-307.
    9. Lingxue Zhang & Seyoung Kim, 2014. "Learning Gene Networks under SNP Perturbations Using eQTL Datasets," PLOS Computational Biology, Public Library of Science, vol. 10(2), pages 1-20, February.
    10. Majid Jahani & Naga Venkata C. Gudapati & Chenxin Ma & Rachael Tappenden & Martin Takáč, 2021. "Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences," Computational Optimization and Applications, Springer, vol. 79(2), pages 369-404, June.
    11. Xiaolong Qin & Nguyen Thai An, 2019. "Smoothing algorithms for computing the projection onto a Minkowski sum of convex sets," Computational Optimization and Applications, Springer, vol. 74(3), pages 821-850, December.
    12. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    13. Guillaume Sagnol & Edouard Pauwels, 2019. "An unexpected connection between Bayes A-optimal designs and the group lasso," Statistical Papers, Springer, vol. 60(2), pages 565-584, April.
    14. Bakalli, Gaetan & Guerrier, Stéphane & Scaillet, Olivier, 2023. "A penalized two-pass regression to predict stock returns with time-varying risk premia," Journal of Econometrics, Elsevier, vol. 237(2).
    15. Shuichi Kawano, 2014. "Selection of tuning parameters in bridge regression models via Bayesian information criterion," Statistical Papers, Springer, vol. 55(4), pages 1207-1223, November.
    16. Wongsa-art, Pipat & Kim, Namhyun & Xia, Yingcun & Moscone, Francesco, 2024. "Varying coefficient panel data models and methods under correlated error components: Application to disparities in mental health services in England," Regional Science and Urban Economics, Elsevier, vol. 106(C).
    17. Dong, C. & Li, S., 2021. "Specification Lasso and an Application in Financial Markets," Cambridge Working Papers in Economics 2139, Faculty of Economics, University of Cambridge.
    18. Lam, Clifford, 2008. "Estimation of large precision matrices through block penalization," LSE Research Online Documents on Economics 31543, London School of Economics and Political Science, LSE Library.
    19. Weiyang Ding & Michael K. Ng & Wenxing Zhang, 2024. "A generalized alternating direction implicit method for consensus optimization: application to distributed sparse logistic regression," Journal of Global Optimization, Springer, vol. 90(3), pages 727-753, November.
    20. Gregory Vaughan & Robert Aseltine & Kun Chen & Jun Yan, 2017. "Stagewise generalized estimating equations with grouped variables," Biometrics, The International Biometric Society, vol. 73(4), pages 1332-1342, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:coopap:v:58:y:2014:i:2:p:455-482. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.