
Robust subset selection

Author

Listed:
  • Thompson, Ryan

Abstract

The best subset selection (or “best subsets”) estimator is a classic tool for sparse regression, and developments in mathematical optimization over the past decade have made it more computationally tractable than ever. Notwithstanding its desirable statistical properties, the best subsets estimator is susceptible to outliers and can break down in the presence of a single contaminated data point. To address this issue, a robust adaption of best subsets is proposed that is highly resistant to contamination in both the response and the predictors. The adapted estimator generalizes the notion of subset selection to both predictors and observations, thereby achieving robustness in addition to sparsity. This procedure, referred to as “robust subset selection” (or “robust subsets”), is defined by a combinatorial optimization problem for which modern discrete optimization methods are applied. The robustness of the estimator in terms of the finite-sample breakdown point of its objective value is formally established. In support of this result, experiments on synthetic and real data are reported that demonstrate the superiority of robust subsets over best subsets in the presence of contamination. Importantly, robust subsets fares competitively across several metrics compared with popular robust adaptions of continuous shrinkage estimators.
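
In concrete terms, the abstract describes an estimator that couples a sparsity constraint on the coefficients with trimming of observations: roughly, minimize the sum of the n − h smallest squared residuals subject to at most k nonzero coefficients. The toy sketch below is illustrative only and rests on that reading of the abstract; it is not the paper's mixed-integer optimization formulation or its heuristics, and the function name and single trimming pass are made up for exposition. It simply enumerates small predictor subsets and trims the worst-fitting rows, to make "subset selection over both predictors and observations" concrete.

    import itertools
    import numpy as np

    def robust_subset_ols(X, y, k, h):
        """Illustrative brute force: choose at most k predictors and drop up to
        h observations so the residual sum of squares on the kept rows is small.
        Exponential in the number of predictors; toy use only."""
        n, p = X.shape
        best = (np.inf, None, None)  # (trimmed RSS, predictor subset, kept rows)
        for size in range(1, k + 1):
            for S in itertools.combinations(range(p), size):
                XS = X[:, list(S)]
                beta = np.linalg.lstsq(XS, y, rcond=None)[0]
                resid2 = (y - XS @ beta) ** 2
                keep = np.argsort(resid2)[: n - h]   # trim the h worst-fitting rows
                beta = np.linalg.lstsq(XS[keep], y[keep], rcond=None)[0]
                rss = np.sum((y[keep] - XS[keep] @ beta) ** 2)
                if rss < best[0]:
                    best = (rss, S, keep)
        return best

    # Example: a sparse linear model with a handful of contaminated responses.
    rng = np.random.default_rng(0)
    n, p = 50, 6
    X = rng.standard_normal((n, p))
    y = 3 * X[:, 0] - 2 * X[:, 1] + 0.5 * rng.standard_normal(n)
    y[:5] += 20                                      # contaminate 5 responses
    rss, subset, kept = robust_subset_ols(X, y, k=2, h=5)
    print(subset, rss)

In this sketch, the trimming step supplies the resistance to contaminated rows and the enumeration over predictor subsets supplies the sparsity; the paper's contribution is solving the exact combinatorial problem with modern discrete optimization rather than a heuristic pass like this one.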

Suggested Citation

  • Thompson, Ryan, 2022. "Robust subset selection," Computational Statistics & Data Analysis, Elsevier, vol. 169(C).
  • Handle: RePEc:eee:csdana:v:169:y:2022:i:c:s0167947321002498
    DOI: 10.1016/j.csda.2021.107415

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947321002498
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2021.107415?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can access this item with your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Yuichi Takano & Ryuhei Miyashiro, 2020. "Best subset selection via cross-validation criterion," TOP: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 28(2), pages 475-488, July.
    2. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
    3. She, Yiyuan & Owen, Art B., 2011. "Outlier Detection Using Nonconvex Penalized Regression," Journal of the American Statistical Association, American Statistical Association, vol. 106(494), pages 626-639.
    4. Hofmann, Marc & Gatu, Cristian & Kontoghiorghes, Erricos John, 2007. "Efficient algorithms for computing the best subset regression models for large-scale problems," Computational Statistics & Data Analysis, Elsevier, vol. 52(1), pages 16-29, September.
    5. Khan, Jafar A. & Van Aelst, Stefan & Zamar, Ruben H., 2007. "Building a robust linear model with forward selection and stepwise procedures," Computational Statistics & Data Analysis, Elsevier, vol. 52(1), pages 239-248, September.
    6. McCann, Lauren & Welsch, Roy E., 2007. "Robust variable selection using least angle regression and elemental set sampling," Computational Statistics & Data Analysis, Elsevier, vol. 52(1), pages 249-257, September.
    7. Fan, Jianqing & Li, Runze, 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    8. Smucler, Ezequiel & Yohai, Victor J., 2017. "Robust and sparse estimators for linear regression models," Computational Statistics & Data Analysis, Elsevier, vol. 111(C), pages 116-130.
    9. Menjoge, Rajiv S. & Welsch, Roy E., 2010. "A diagnostic method for simultaneous feature selection and outlier identification in linear regression," Computational Statistics & Data Analysis, Elsevier, vol. 54(12), pages 3181-3193, December.
    10. M. J. Garside, 1965. "The Best Sub‐Set in Multiple Regression Analysis," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 14(2-3), pages 196-200, November.
    11. Wang, Hansheng & Li, Guodong & Jiang, Guohua, 2007. "Robust Regression Shrinkage and Consistent Variable Selection Through the LAD-Lasso," Journal of Business & Economic Statistics, American Statistical Association, vol. 25, pages 347-355, July.
    12. Xiaotong Shen & Wei Pan & Yunzhang Zhu & Hui Zhou, 2013. "On constrained and regularized high-dimensional regression," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 65(5), pages 807-832, October.
    13. G. Zioutas & L. Pitsoulis & A. Avramidis, 2009. "Quadratic mixed integer programming and support vectors for deleting outliers in robust regression," Annals of Operations Research, Springer, vol. 166(1), pages 339-353, February.
    14. Khan, Jafar A. & Van Aelst, Stefan & Zamar, Ruben H., 2007. "Robust Linear Model Selection Based on Least Angle Regression," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 1289-1299, December.
    15. Xueqin Wang & Yunlu Jiang & Mian Huang & Heping Zhang, 2013. "Robust Variable Selection With Exponential Squared Loss," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 108(502), pages 632-643, June.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Barbato, Michele & Ceselli, Alberto, 2024. "Mathematical programming for simultaneous feature selection and outlier detection under l1 norm," European Journal of Operational Research, Elsevier, vol. 316(3), pages 1070-1084.
    2. Zhan Gao & Hyungsik Roger Moon, 2024. "Robust Estimation of Regression Models with Potentially Endogenous Outliers via a Modern Optimization Lens," Papers 2408.03930, arXiv.org.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
    2. Luca Insolia & Ana Kenney & Francesca Chiaromonte & Giovanni Felici, 2022. "Simultaneous feature selection and outlier detection with optimality guarantees," Biometrics, The International Biometric Society, vol. 78(4), pages 1592-1603, December.
    3. Smucler, Ezequiel & Yohai, Victor J., 2017. "Robust and sparse estimators for linear regression models," Computational Statistics & Data Analysis, Elsevier, vol. 111(C), pages 116-130.
    4. Khan, Jafar A. & Van Aelst, Stefan & Zamar, Ruben H., 2010. "Fast robust estimation of prediction error based on resampling," Computational Statistics & Data Analysis, Elsevier, vol. 54(12), pages 3121-3130, December.
    5. Gijbels, I. & Vrinssen, I., 2015. "Robust nonnegative garrote variable selection in linear regression," Computational Statistics & Data Analysis, Elsevier, vol. 85(C), pages 1-22.
    6. Tianxiang Liu & Ting Kei Pong & Akiko Takeda, 2019. "A refined convergence analysis of pDCA_e with applications to simultaneous sparse recovery and outlier detection," Computational Optimization and Applications, Springer, vol. 73(1), pages 69-100, May.
    7. Farnè, Matteo & Vouldis, Angelos T., 2018. "A methodology for automised outlier detection in high-dimensional datasets: an application to euro area banks' supervisory data," Working Paper Series 2171, European Central Bank.
    8. Mingqiu Wang & Guo-Liang Tian, 2016. "Robust group non-convex estimations for high-dimensional partially linear models," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 28(1), pages 49-67, March.
    9. Junlong Zhao & Chao Liu & Lu Niu & Chenlei Leng, 2019. "Multiple influential point detection in high dimensional regression spaces," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 81(2), pages 385-408, April.
    10. Z. John Daye & Jinbo Chen & Hongzhe Li, 2012. "High-Dimensional Heteroscedastic Regression with an Application to eQTL Data Analysis," Biometrics, The International Biometric Society, vol. 68(1), pages 316-326, March.
    11. Su, Peng & Tarr, Garth & Muller, Samuel & Wang, Suojin, 2024. "CR-Lasso: Robust cellwise regularized sparse regression," Computational Statistics & Data Analysis, Elsevier, vol. 197(C).
    12. Song, Yunquan & Liang, Xijun & Zhu, Yanji & Lin, Lu, 2021. "Robust variable selection with exponential squared loss for the spatial autoregressive model," Computational Statistics & Data Analysis, Elsevier, vol. 155(C).
    13. Riani, Marco & Atkinson, Anthony C., 2010. "Robust model selection with flexible trimming," Computational Statistics & Data Analysis, Elsevier, vol. 54(12), pages 3300-3312, December.
    14. Weiyan Mu & Shifeng Xiong, 2014. "Some notes on robust sure independence screening," Journal of Applied Statistics, Taylor & Francis Journals, vol. 41(10), pages 2092-2102, October.
    15. Elvezio Ronchetti, 2021. "The main contributions of robust statistics to statistical science and a new challenge," METRON, Springer;Sapienza Università di Roma, vol. 79(2), pages 127-135, August.
    16. Kepplinger, David, 2023. "Robust variable selection and estimation via adaptive elastic net S-estimators for linear regression," Computational Statistics & Data Analysis, Elsevier, vol. 183(C).
    17. Luca Insolia & Ana Kenney & Martina Calovi & Francesca Chiaromonte, 2021. "Robust Variable Selection with Optimality Guarantees for High-Dimensional Logistic Regression," Stats, MDPI, vol. 4(3), pages 1-17, August.
    18. Qingguo Tang & R. J. Karunamuni, 2018. "Robust variable selection for finite mixture regression models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 70(3), pages 489-521, June.
    19. N. Neykov & P. Filzmoser & P. Neytchev, 2014. "Ultrahigh dimensional variable selection through the penalized maximum trimmed likelihood estimator," Statistical Papers, Springer, vol. 55(1), pages 187-207, February.
    20. Salibian-Barrera, Matias & Van Aelst, Stefan, 2008. "Robust model selection using fast and robust bootstrap," Computational Statistics & Data Analysis, Elsevier, vol. 52(12), pages 5121-5135, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:169:y:2022:i:c:s0167947321002498. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.