
Robust nonnegative garrote variable selection in linear regression

Author

  • Gijbels, I.
  • Vrinssen, I.

Abstract

Robust selection of variables in a linear regression model is investigated. Many variable selection methods are available, but very few are designed to avoid sensitivity to vertical outliers as well as to leverage points. The nonnegative garrote method is a powerful variable selection method, originally developed for linear regression and recently extended successfully to more complex regression models. The method performs well, and its theoretical properties have been established. The aim is to robustify the nonnegative garrote method for linear regression so as to make it robust to vertical outliers and leverage points. Several approaches are discussed, and recommendations towards a final well-performing robust nonnegative garrote method are given. The proposed method is evaluated in a simulation study that also includes a comparison with existing methods. The method performs very well, and often outperforms existing methods. A real data application illustrates the use of the method in practice.
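To fix ideas, the sketch below illustrates the classical nonnegative garrote criterion that the paper sets out to robustify: initial coefficient estimates are shrunk by nonnegative factors chosen under a penalty, and variables whose factor is driven to zero are dropped. This is a minimal Python sketch, not the authors' procedure: it assumes NumPy, SciPy and scikit-learn are available, uses a Huber-type initial fit as a simple stand-in for a robust initial estimator (which guards against vertical outliers but not against leverage points), and the function name nonnegative_garrote is purely illustrative.

    # Minimal sketch of a (robustly initialised) nonnegative garrote;
    # all names below are illustrative, not taken from the paper.
    import numpy as np
    from scipy.optimize import minimize
    from sklearn.linear_model import HuberRegressor

    def nonnegative_garrote(X, y, lam, robust_init=True):
        # Step 1: initial coefficient estimates (Huber-type robust fit or OLS).
        if robust_init:
            beta0 = HuberRegressor(fit_intercept=False).fit(X, y).coef_
        else:
            beta0, *_ = np.linalg.lstsq(X, y, rcond=None)
        # Step 2: solve min_{c >= 0} 0.5*||y - Z c||^2 + lam*sum(c),
        # where column j of Z is x_j scaled by beta0_j (the garrote criterion).
        Z = X * beta0
        def objective(c):
            r = y - Z @ c
            return 0.5 * (r @ r) + lam * c.sum()
        def gradient(c):
            return -Z.T @ (y - Z @ c) + lam
        p = X.shape[1]
        res = minimize(objective, x0=np.ones(p), jac=gradient,
                       method="L-BFGS-B", bounds=[(0.0, None)] * p)
        c = res.x
        return c, c * beta0          # variables with c_j == 0 are deselected

    # Toy usage: three informative predictors, two noise predictors, and a few
    # vertical outliers in the response.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 5))
    y = X[:, :3] @ np.array([2.0, -1.5, 1.0]) + 0.5 * rng.standard_normal(200)
    y[:5] += 15.0
    c, beta = nonnegative_garrote(X, y, lam=5.0)
    print(np.round(c, 2), np.round(beta, 2))

In this toy run the shrinkage factors of the two noise predictors are typically driven to (or near) zero, while the informative predictors keep nonzero shrunken coefficients; the paper compares several such robustification strategies and recommends a final robust nonnegative garrote method.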

Suggested Citation

  • Gijbels, I. & Vrinssen, I., 2015. "Robust nonnegative garrote variable selection in linear regression," Computational Statistics & Data Analysis, Elsevier, vol. 85(C), pages 1-22.
  • Handle: RePEc:eee:csdana:v:85:y:2015:i:c:p:1-22
    DOI: 10.1016/j.csda.2014.11.009

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947314003326
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2014.11.009?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lan Wang & Runze Li, 2009. "Weighted Wilcoxon-Type Smoothly Clipped Absolute Deviation Method," Biometrics, The International Biometric Society, vol. 65(2), pages 564-571, June.
    2. Arslan, Olcay, 2012. "Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression," Computational Statistics & Data Analysis, Elsevier, vol. 56(6), pages 1952-1965.
    3. Wang, Hansheng & Li, Guodong & Jiang, Guohua, 2007. "Robust Regression Shrinkage and Consistent Variable Selection Through the LAD-Lasso," Journal of Business & Economic Statistics, American Statistical Association, vol. 25, pages 347-355, July.
    4. Ming Yuan & Yi Lin, 2007. "On the non‐negative garrotte estimator," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 69(2), pages 143-161, April.
    5. Khan, Jafar A. & Van Aelst, Stefan & Zamar, Ruben H., 2007. "Robust Linear Model Selection Based on Least Angle Regression," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 1289-1299, December.
    6. Fan, J. & Li, R., 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    7. Xueqin Wang & Yunlu Jiang & Mian Huang & Heping Zhang, 2013. "Robust Variable Selection With Exponential Squared Loss," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 108(502), pages 632-643, June.
    8. Khan, Jafar A. & Van Aelst, Stefan & Zamar, Ruben H., 2010. "Fast robust estimation of prediction error based on resampling," Computational Statistics & Data Analysis, Elsevier, vol. 54(12), pages 3121-3130, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
    2. Yang Peng & Bin Luo & Xiaoli Gao, 2022. "Robust Moderately Clipped LASSO for Simultaneous Outlier Detection and Variable Selection," Sankhya B: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 84(2), pages 694-707, November.
    3. Kepplinger, David, 2023. "Robust variable selection and estimation via adaptive elastic net S-estimators for linear regression," Computational Statistics & Data Analysis, Elsevier, vol. 183(C).
    4. Yongshi Liu & Xiaodong Yu & Jianjun Zhao & Changchun Pan & Kai Sun, 2022. "Development of a Robust Data-Driven Soft Sensor for Multivariate Industrial Processes with Non-Gaussian Noise and Outliers," Mathematics, MDPI, vol. 10(20), pages 1-16, October.
    5. Mingqiu Wang & Guo-Liang Tian, 2016. "Robust group non-convex estimations for high-dimensional partially linear models," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 28(1), pages 49-67, March.
    6. Rand R. Wilcox, 2018. "Robust regression: an inferential method for determining which independent variables are most important," Journal of Applied Statistics, Taylor & Francis Journals, vol. 45(1), pages 100-111, January.
    7. Mat Daut, Mohammad Azhar & Hassan, Mohammad Yusri & Abdullah, Hayati & Rahman, Hasimah Abdul & Abdullah, Md Pauzi & Hussin, Faridah, 2017. "Building electrical energy consumption forecasting analysis using conventional and artificial intelligence methods: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 70(C), pages 1108-1118.
    8. Smucler, Ezequiel & Yohai, Victor J., 2017. "Robust and sparse estimators for linear regression models," Computational Statistics & Data Analysis, Elsevier, vol. 111(C), pages 116-130.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Smucler, Ezequiel & Yohai, Victor J., 2017. "Robust and sparse estimators for linear regression models," Computational Statistics & Data Analysis, Elsevier, vol. 111(C), pages 116-130.
    2. Mingqiu Wang & Guo-Liang Tian, 2016. "Robust group non-convex estimations for high-dimensional partially linear models," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 28(1), pages 49-67, March.
    3. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
    4. Thompson, Ryan, 2022. "Robust subset selection," Computational Statistics & Data Analysis, Elsevier, vol. 169(C).
    5. Yeşim Güney & Yetkin Tuaç & Şenay Özdemir & Olcay Arslan, 2021. "Robust estimation and variable selection in heteroscedastic regression model using least favorable distribution," Computational Statistics, Springer, vol. 36(2), pages 805-827, June.
    6. Qingguo Tang & R. J. Karunamuni, 2018. "Robust variable selection for finite mixture regression models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 70(3), pages 489-521, June.
    7. Liya Fu & Zhuoran Yang & Fengjing Cai & You-Gan Wang, 2021. "Efficient and doubly-robust methods for variable selection and parameter estimation in longitudinal data analysis," Computational Statistics, Springer, vol. 36(2), pages 781-804, June.
    8. N. Neykov & P. Filzmoser & P. Neytchev, 2014. "Ultrahigh dimensional variable selection through the penalized maximum trimmed likelihood estimator," Statistical Papers, Springer, vol. 55(1), pages 187-207, February.
    9. Long Feng & Changliang Zou & Zhaojun Wang & Xianwu Wei & Bin Chen, 2015. "Robust spline-based variable selection in varying coefficient model," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 78(1), pages 85-118, January.
    10. Song, Yunquan & Liang, Xijun & Zhu, Yanji & Lin, Lu, 2021. "Robust variable selection with exponential squared loss for the spatial autoregressive model," Computational Statistics & Data Analysis, Elsevier, vol. 155(C).
    11. Qiang Li & Liming Wang, 2020. "Robust change point detection method via adaptive LAD-LASSO," Statistical Papers, Springer, vol. 61(1), pages 109-121, February.
    12. Barbato, Michele & Ceselli, Alberto, 2024. "Mathematical programming for simultaneous feature selection and outlier detection under l1 norm," European Journal of Operational Research, Elsevier, vol. 316(3), pages 1070-1084.
    13. Tianfa Xie & Ruiyuan Cao & Jiang Du, 2020. "Variable selection for spatial autoregressive models with a diverging number of parameters," Statistical Papers, Springer, vol. 61(3), pages 1125-1145, June.
    14. Diego Vidaurre & Concha Bielza & Pedro Larrañaga, 2013. "A Survey of L1 Regression," International Statistical Review, International Statistical Institute, vol. 81(3), pages 361-387, December.
    15. Kangning Wang & Lu Lin, 2019. "Robust and efficient estimator for simultaneous model structure identification and variable selection in generalized partial linear varying coefficient models with longitudinal data," Statistical Papers, Springer, vol. 60(5), pages 1649-1676, October.
    16. Zhao, Weihua & Lian, Heng, 2017. "Quantile index coefficient model with variable selection," Journal of Multivariate Analysis, Elsevier, vol. 154(C), pages 40-58.
    17. Yafen Ye & Renyong Chi & Yuan-Hai Shao & Chun-Na Li & Xiangyu Hua, 2022. "Indicator Selection of Index Construction by Adaptive Lasso with a Generic ε-Insensitive Loss," Computational Economics, Springer;Society for Computational Economics, vol. 60(3), pages 971-990, October.
    18. Weiyan Mu & Shifeng Xiong, 2014. "Some notes on robust sure independence screening," Journal of Applied Statistics, Taylor & Francis Journals, vol. 41(10), pages 2092-2102, October.
    19. Zhu Wang, 2022. "MM for penalized estimation," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 31(1), pages 54-75, March.
    20. Weichi Wu & Zhou Zhou, 2017. "Nonparametric Inference for Time-Varying Coefficient Quantile Regression," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 35(1), pages 98-109, January.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:85:y:2015:i:c:p:1-22. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.