Printed from https://ideas.repec.org/a/eee/csdana/v70y2014icp198-211.html

Stabilizing the lasso against cross-validation variability

Author

Listed:
  • Roberts, S.
  • Nowak, G.

Abstract

An abundance of high-dimensional data has meant that L1 penalized regression, known as the lasso, has become an indispensable tool of the practitioner. A feature of the lasso is a “tuning” parameter that controls the amount of shrinkage applied to the coefficients. In practice, a value for the tuning parameter is chosen using the method of cross-validation. It is shown that the model that is selected by the lasso can be extremely sensitive to the fold assignment used for cross-validation. A consequence of this sensitivity is that the results from a lasso analysis can lack interpretability. To overcome this model-selection instability of the lasso, a method called the percentile-lasso is introduced. The model selected by the percentile-lasso corresponds to the model selected by the lasso, when the lasso is fitted using an appropriate percentile of the possible “optimal” tuning parameter values. It is demonstrated that the percentile-lasso can achieve substantial improvements in both model-selection stability and model-selection error compared to the lasso. Importantly, when applied to real data the percentile-lasso, unlike the lasso, produces interpretable results, that is, results that are robust to the assignment of observations to folds for cross-validation. The percentile-lasso is easily applied to extensions of the lasso and in the context of penalized generalized linear models.
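The procedure described in the abstract can be sketched in a few lines: repeat cross-validation under many random fold assignments, collect each "optimal" tuning parameter, and refit the lasso at a chosen percentile of those values. The sketch below uses scikit-learn as an assumed implementation vehicle; the function name `percentile_lasso`, the number of repeats, and the 75th-percentile choice are illustrative, not taken from the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso, LassoCV

def percentile_lasso(X, y, n_repeats=20, percentile=75, cv=5, seed=0):
    """Sketch of the percentile-lasso idea: rerun K-fold CV under
    different random fold assignments, collect each selected penalty
    (alpha), then refit at a percentile of those penalties."""
    rng = np.random.default_rng(seed)
    alphas = []
    for _ in range(n_repeats):
        # Permuting the rows changes the (unshuffled) K-fold assignment,
        # so each repeat sees a different fold split.
        perm = rng.permutation(len(y))
        fit = LassoCV(cv=cv).fit(X[perm], y[perm])
        alphas.append(fit.alpha_)
    # A higher percentile means a larger penalty, hence a sparser and
    # typically more stable model; the percentile is itself a choice.
    alpha_p = np.percentile(alphas, percentile)
    return Lasso(alpha=alpha_p).fit(X, y)

# Toy usage on synthetic data with one strong predictor.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 20))
y = 3.0 * X[:, 0] + rng.normal(scale=0.5, size=100)
model = percentile_lasso(X, y)
```

The point of the percentile step is that the spread of the collected `alphas` is exactly the fold-assignment sensitivity the paper documents; a single CV run would pick one of these values essentially at random.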

Suggested Citation

  • Roberts, S. & Nowak, G., 2014. "Stabilizing the lasso against cross-validation variability," Computational Statistics & Data Analysis, Elsevier, vol. 70(C), pages 198-211.
  • Handle: RePEc:eee:csdana:v:70:y:2014:i:c:p:198-211
    DOI: 10.1016/j.csda.2013.09.008

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S016794731300323X
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2013.09.008?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    2. Nicolai Meinshausen & Peter Bühlmann, 2010. "Stability selection," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 72(4), pages 417-473, September.
    3. Meinshausen, Nicolai, 2007. "Relaxed Lasso," Computational Statistics & Data Analysis, Elsevier, vol. 52(1), pages 374-393, September.
    4. Friedman, Jerome H. & Hastie, Trevor & Tibshirani, Rob, 2010. "Regularization Paths for Generalized Linear Models via Coordinate Descent," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 33(i01).
    5. Hui Zou & Trevor Hastie, 2005. "Addendum: Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(5), pages 768-768, November.
    6. Hui Zou & Trevor Hastie, 2005. "Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(2), pages 301-320, April.
    7. Wang, Xiaoming & Park, Taesung & Carriere, K.C., 2010. "Variable selection via combined penalization for high-dimensional data analysis," Computational Statistics & Data Analysis, Elsevier, vol. 54(10), pages 2230-2243, October.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Pedro Delicado & Philippe Vieu, 2017. "Choosing the most relevant level sets for depicting a sample of densities," Computational Statistics, Springer, vol. 32(3), pages 1083-1113, September.
    2. Mohamed Ouhourane & Yi Yang & Andréa L. Benedet & Karim Oualkacha, 2022. "Group penalized quantile regression," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 31(3), pages 495-529, September.
    3. Emma Saulnier & Olivier Gascuel & Samuel Alizon, 2017. "Inferring epidemiological parameters from phylogenies using regression-ABC: A comparative study," PLOS Computational Biology, Public Library of Science, vol. 13(3), pages 1-31, March.
    4. Zhao, Xin & Barber, Stuart & Taylor, Charles C. & Milan, Zoka, 2018. "Classification tree methods for panel data using wavelet-transformed time series," Computational Statistics & Data Analysis, Elsevier, vol. 127(C), pages 204-216.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Mkhadri, Abdallah & Ouhourane, Mohamed, 2013. "An extended variable inclusion and shrinkage algorithm for correlated variables," Computational Statistics & Data Analysis, Elsevier, vol. 57(1), pages 631-644.
    2. Capanu, Marinela & Giurcanu, Mihai & Begg, Colin B. & Gönen, Mithat, 2023. "Subsampling based variable selection for generalized linear models," Computational Statistics & Data Analysis, Elsevier, vol. 184(C).
    3. Qin, Yichen & Wang, Linna & Li, Yang & Li, Rong, 2023. "Visualization and assessment of model selection uncertainty," Computational Statistics & Data Analysis, Elsevier, vol. 178(C).
    4. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    5. Christopher J Greenwood & George J Youssef & Primrose Letcher & Jacqui A Macdonald & Lauryn J Hagg & Ann Sanson & Jenn Mcintosh & Delyse M Hutchinson & John W Toumbourou & Matthew Fuller-Tyszkiewicz &, 2020. "A comparison of penalised regression methods for informing the selection of predictive markers," PLOS ONE, Public Library of Science, vol. 15(11), pages 1-14, November.
    6. Peter Bühlmann & Jacopo Mandozzi, 2014. "High-dimensional variable screening and bias in subsequent inference, with an empirical comparison," Computational Statistics, Springer, vol. 29(3), pages 407-430, June.
    7. Tomáš Plíhal, 2021. "Scheduled macroeconomic news announcements and Forex volatility forecasting," Journal of Forecasting, John Wiley & Sons, Ltd., vol. 40(8), pages 1379-1397, December.
    8. Loann David Denis Desboulets, 2018. "A Review on Variable Selection in Regression Analysis," Econometrics, MDPI, vol. 6(4), pages 1-27, November.
    9. Zeyu Bian & Erica E. M. Moodie & Susan M. Shortreed & Sahir Bhatnagar, 2023. "Variable selection in regression‐based estimation of dynamic treatment regimes," Biometrics, The International Biometric Society, vol. 79(2), pages 988-999, June.
    10. Zanhua Yin, 2020. "Variable selection for sparse logistic regression," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 83(7), pages 821-836, October.
    11. Dumitrescu, Elena & Hué, Sullivan & Hurlin, Christophe & Tokpavi, Sessi, 2022. "Machine learning for credit scoring: Improving logistic regression with non-linear decision-tree effects," European Journal of Operational Research, Elsevier, vol. 297(3), pages 1178-1192.
    13. Holger Breinlich & Valentina Corradi & Nadia Rocha & Michele Ruta & Joao M.C. Santos Silva & Tom Zylkin, 2021. "Machine Learning in International Trade Research - Evaluating the Impact of Trade Agreements," School of Economics Discussion Papers 0521, School of Economics, University of Surrey.
    13. Pei Wang & Shunjie Chen & Sijia Yang, 2022. "Recent Advances on Penalized Regression Models for Biological Data," Mathematics, MDPI, vol. 10(19), pages 1-24, October.
    14. Zakariya Yahya Algamal & Muhammad Hisyam Lee, 2019. "A two-stage sparse logistic regression for optimal gene selection in high-dimensional microarray data classification," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 13(3), pages 753-771, September.
    15. Dai, Linlin & Chen, Kani & Sun, Zhihua & Liu, Zhenqiu & Li, Gang, 2018. "Broken adaptive ridge regression and its asymptotic properties," Journal of Multivariate Analysis, Elsevier, vol. 168(C), pages 334-351.
    16. Liang, Lixing & Zhuang, Yipeng & Yu, Philip L.H., 2024. "Variable selection for high-dimensional incomplete data," Computational Statistics & Data Analysis, Elsevier, vol. 192(C).
    17. Posch, Konstantin & Arbeiter, Maximilian & Pilz, Juergen, 2020. "A novel Bayesian approach for variable selection in linear regression models," Computational Statistics & Data Analysis, Elsevier, vol. 144(C).
    18. Tanin Sirimongkolkasem & Reza Drikvandi, 2019. "On Regularisation Methods for Analysis of High Dimensional Data," Annals of Data Science, Springer, vol. 6(4), pages 737-763, December.
    19. Fang, Xiaolei & Paynabar, Kamran & Gebraeel, Nagi, 2017. "Multistream sensor fusion-based prognostics model for systems with single failure modes," Reliability Engineering and System Safety, Elsevier, vol. 159(C), pages 322-331.
    20. Hui Xiao & Yiguo Sun, 2019. "On Tuning Parameter Selection in Model Selection and Model Averaging: A Monte Carlo Study," JRFM, MDPI, vol. 12(3), pages 1-16, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:70:y:2014:i:c:p:198-211. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.