
Convergence and sparsity of Lasso and group Lasso in high-dimensional generalized linear models

Author

Listed:
  • Lichun Wang
  • Yuan You
  • Heng Lian

Abstract

In this short paper, we investigate Lasso regularized generalized linear models in the “small $n$, large $p$” setting. While similar problems have been well studied with the SCAD penalty, the study of the Lasso penalty is mostly restricted to the least squares loss function. Here we show the convergence rate of the Lasso penalized estimator as well as the sparsity property under suitable assumptions. We also extend the results to group Lasso regularized models when the variables are naturally grouped. Copyright Springer-Verlag Berlin Heidelberg 2015
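The Lasso penalized GLM estimator described in the abstract can be illustrated with a small numerical sketch. The code below is an illustrative proximal-gradient (ISTA) solver for l1-penalized logistic regression, one common GLM; it is not the paper's own algorithm, and the data dimensions, penalty level, and function names are assumptions chosen purely for demonstration of the "small n, large p" sparsity behavior.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1: shrinks each entry toward zero,
    # setting small entries exactly to zero (the source of sparsity).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_logistic(X, y, lam, n_iter=500):
    # Proximal gradient (ISTA) for l1-penalized logistic regression.
    n, p = X.shape
    beta = np.zeros(p)
    # Step size 1/L, where L = ||X||_2^2 / (4n) bounds the Lipschitz
    # constant of the averaged logistic-loss gradient.
    step = 4.0 * n / (np.linalg.norm(X, 2) ** 2)
    for _ in range(n_iter):
        mu = 1.0 / (1.0 + np.exp(-X @ beta))   # fitted probabilities
        grad = X.T @ (mu - y) / n              # gradient of avg. neg. log-likelihood
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta

# Small n, large p: 50 observations, 200 predictors, 5 truly active.
rng = np.random.default_rng(0)
n, p, s = 50, 200, 5
X = rng.standard_normal((n, p))
true_beta = np.zeros(p)
true_beta[:s] = 2.0                            # sparse true coefficient vector
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ true_beta)))

beta_hat = lasso_logistic(X, y, lam=0.1)
n_nonzero = int(np.sum(beta_hat != 0))
print(n_nonzero, "of", p, "estimated coefficients are nonzero")
```

For the group Lasso extension discussed in the abstract, the entrywise soft-threshold step would be replaced by a group-wise shrinkage that scales each block of coefficients toward zero by its Euclidean norm, so that whole groups of variables are selected or dropped together.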

Suggested Citation

  • Lichun Wang & Yuan You & Heng Lian, 2015. "Convergence and sparsity of Lasso and group Lasso in high-dimensional generalized linear models," Statistical Papers, Springer, vol. 56(3), pages 819-828, August.
  • Handle: RePEc:spr:stpapr:v:56:y:2015:i:3:p:819-828
    DOI: 10.1007/s00362-014-0609-3

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1007/s00362-014-0609-3
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1007/s00362-014-0609-3?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a location where your library subscription provides access to this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Fan J. & Li R., 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    2. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    3. Lukas Meier & Sara Van De Geer & Peter Bühlmann, 2008. "The group lasso for logistic regression," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 70(1), pages 53-71, February.
    4. Ming Yuan & Yi Lin, 2007. "Model selection and estimation in the Gaussian graphical model," Biometrika, Biometrika Trust, vol. 94(1), pages 19-35.
    5. Michael Schomaker, 2012. "Shrinkage averaging estimation," Statistical Papers, Springer, vol. 53(4), pages 1015-1034, November.
    6. Ming Yuan & Yi Lin, 2006. "Model selection and estimation in regression with grouped variables," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 68(1), pages 49-67, February.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Xiaoli Gao, 2018. "A flexible shrinkage operator for fussy grouped variable selection," Statistical Papers, Springer, vol. 59(3), pages 985-1008, September.
    2. Mingrui Zhong & Zanhua Yin & Zhichao Wang, 2023. "Variable Selection for Sparse Logistic Regression with Grouped Variables," Mathematics, MDPI, vol. 11(24), pages 1-21, December.
    3. Mingqiu Wang & Guo-Liang Tian, 2019. "Adaptive group Lasso for high-dimensional generalized linear models," Statistical Papers, Springer, vol. 60(5), pages 1469-1486, October.
    4. Kristoffer Pons Bertelsen, 2022. "The Prior Adaptive Group Lasso and the Factor Zoo," CREATES Research Papers 2022-05, Department of Economics and Business Economics, Aarhus University.
    5. Gabriela Ciuperca, 2019. "Adaptive group LASSO selection in quantile models," Statistical Papers, Springer, vol. 60(1), pages 173-197, February.
    6. Chen, Yang & Luo, Ziyan & Kong, Lingchen, 2021. "ℓ2,0-norm based selection and estimation for multivariate generalized linear models," Journal of Multivariate Analysis, Elsevier, vol. 185(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pei Wang & Shunjie Chen & Sijia Yang, 2022. "Recent Advances on Penalized Regression Models for Biological Data," Mathematics, MDPI, vol. 10(19), pages 1-24, October.
    2. Lee, Wonyul & Liu, Yufeng, 2012. "Simultaneous multiple response regression and inverse covariance matrix estimation via penalized Gaussian maximum likelihood," Journal of Multivariate Analysis, Elsevier, vol. 111(C), pages 241-255.
    3. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    4. Lam, Clifford, 2008. "Estimation of large precision matrices through block penalization," LSE Research Online Documents on Economics 31543, London School of Economics and Political Science, LSE Library.
    5. Zanhua Yin, 2020. "Variable selection for sparse logistic regression," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 83(7), pages 821-836, October.
    6. Yanfang Zhang & Chuanhua Wei & Xiaolin Liu, 2022. "Group Logistic Regression Models with l p,q Regularization," Mathematics, MDPI, vol. 10(13), pages 1-15, June.
    7. Bang, Sungwan & Jhun, Myoungshic, 2012. "Simultaneous estimation and factor selection in quantile regression via adaptive sup-norm regularization," Computational Statistics & Data Analysis, Elsevier, vol. 56(4), pages 813-826.
    8. Fabian Scheipl & Thomas Kneib & Ludwig Fahrmeir, 2013. "Penalized likelihood and Bayesian function selection in regression models," AStA Advances in Statistical Analysis, Springer;German Statistical Society, vol. 97(4), pages 349-385, October.
    9. Gabriel E Hoffman & Benjamin A Logsdon & Jason G Mezey, 2013. "PUMA: A Unified Framework for Penalized Multiple Regression Analysis of GWAS Data," PLOS Computational Biology, Public Library of Science, vol. 9(6), pages 1-19, June.
    10. Kaida Cai & Hua Shen & Xuewen Lu, 2022. "Adaptive bi-level variable selection for multivariate failure time model with a diverging number of covariates," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 31(4), pages 968-993, December.
    11. Nanshan, Muye & Zhang, Nan & Xun, Xiaolei & Cao, Jiguo, 2022. "Dynamical modeling for non-Gaussian data with high-dimensional sparse ordinary differential equations," Computational Statistics & Data Analysis, Elsevier, vol. 173(C).
    12. Eduardo F. Mendes & Gabriel J. P. Pinto, 2023. "Generalized Information Criteria for Structured Sparse Models," Papers 2309.01764, arXiv.org.
    14. Young Joo Yoon & Cheolwoo Park & Erik Hofmeister & Sangwook Kang, 2012. "Group variable selection in cardiopulmonary cerebral resuscitation data for veterinary patients," Journal of Applied Statistics, Taylor & Francis Journals, vol. 39(7), pages 1605-1621, January.
    15. Haibin Zhang & Juan Wei & Meixia Li & Jie Zhou & Miantao Chao, 2014. "On proximal gradient method for the convex problems regularized with the group reproducing kernel norm," Journal of Global Optimization, Springer, vol. 58(1), pages 169-188, January.
    16. He, Xin & Mao, Xiaojun & Wang, Zhonglei, 2024. "Nonparametric augmented probability weighting with sparsity," Computational Statistics & Data Analysis, Elsevier, vol. 191(C).
    17. Diego Vidaurre & Concha Bielza & Pedro Larrañaga, 2013. "A Survey of L1 Regression," International Statistical Review, International Statistical Institute, vol. 81(3), pages 361-387, December.
    18. Matsui, Hidetoshi, 2014. "Variable and boundary selection for functional data via multiclass logistic regression modeling," Computational Statistics & Data Analysis, Elsevier, vol. 78(C), pages 176-185.
    19. Ziqi Chen & Chenlei Leng, 2016. "Dynamic Covariance Models," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(515), pages 1196-1207, July.
    20. G. Yi & J. Q. Shi & T. Choi, 2011. "Penalized Gaussian Process Regression and Classification for High-Dimensional Nonlinear Data," Biometrics, The International Biometric Society, vol. 67(4), pages 1285-1294, December.
    21. Mingqiu Wang & Guo-Liang Tian, 2016. "Robust group non-convex estimations for high-dimensional partially linear models," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 28(1), pages 49-67, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:stpapr:v:56:y:2015:i:3:p:819-828. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.