IDEAS home Printed from https://ideas.repec.org/a/eee/csdana/v153y2021ics0167947320301560.html

Robust boosting for regression problems

Author

Listed:
  • Ju, Xiaomeng
  • Salibián-Barrera, Matías

Abstract

Gradient boosting algorithms construct a regression predictor as a linear combination of “base learners”. Boosting also offers an approach to obtaining robust non-parametric regression estimators that scale to applications with many explanatory variables. The robust boosting algorithm is based on a two-stage approach, similar to what is done for robust linear regression: it first minimizes a robust residual scale estimator, and then improves the fit by optimizing a bounded loss function. Unlike previous robust boosting proposals, this approach does not require computing an ad hoc residual scale estimator in each boosting iteration. Since the loss functions involved in this robust boosting algorithm are typically non-convex, a reliable initialization step is required, such as an L1 regression tree, which is also fast to compute. A robust variable importance measure can also be calculated via a permutation procedure. Thorough simulation studies and several data analyses show that, when no atypical observations are present, the robust boosting approach works as well as standard gradient boosting with a squared loss. Furthermore, when the data contain outliers, the robust boosting estimator outperforms the alternatives in terms of prediction error and variable selection accuracy.
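The core idea — boosting on a bounded loss so that outlying residuals stop driving the updates — can be sketched in a few lines. The sketch below is not the authors' implementation: it uses regression stumps on a single feature as base learners, a median initialization in place of the paper's L1 regression tree, and a robust scale fixed once via the MAD rather than the paper's minimized residual scale estimator; the bounded loss is Tukey's bisquare, whose ψ-function (the negative gradient) vanishes for large residuals.

```python
import numpy as np

def tukey_gradient(residuals, c):
    """Negative gradient (psi-function) of Tukey's bisquare loss.

    Bounded: residuals larger than c in absolute value contribute
    zero, so outliers do not drive the boosting updates.
    """
    u = residuals / c
    g = residuals * (1.0 - u**2) ** 2
    g[np.abs(u) >= 1.0] = 0.0
    return g

def fit_stump(x, y):
    """Least-squares regression stump on one feature (toy base learner)."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best = (np.inf, xs[0], ys.mean(), ys.mean())
    for i in range(1, len(xs)):
        thr = 0.5 * (xs[i - 1] + xs[i])
        left, right = ys[:i], ys[i:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best[0]:
            best = (sse, thr, left.mean(), right.mean())
    _, thr, lv, rv = best
    return lambda z: np.where(z <= thr, lv, rv)

def robust_boost(x, y, n_iter=100, lr=0.1):
    """Gradient boosting on the bounded Tukey loss (illustrative sketch)."""
    init = np.median(y)                      # robust start (paper: L1 tree)
    pred = np.full_like(y, init, dtype=float)
    # Fixed robust scale via the MAD, standing in for the paper's
    # preliminary robust-residual-scale stage.
    s = 1.4826 * np.median(np.abs(y - init))
    hs = []
    for _ in range(n_iter):
        g = tukey_gradient(y - pred, c=4.685 * s)
        h = fit_stump(x, g)                  # fit base learner to the gradient
        pred = pred + lr * h(x)
        hs.append(h)
    return init, lr, hs

def predict(model, x):
    init, lr, hs = model
    pred = np.full_like(x, init, dtype=float)
    for h in hs:
        pred = pred + lr * h(x)
    return pred
```

With a handful of gross outliers in the response, the bounded gradient zeroes their contribution, so the fit tracks the clean majority of the data — the behavior the simulation studies quantify against squared-loss boosting.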

Suggested Citation

  • Ju, Xiaomeng & Salibián-Barrera, Matías, 2021. "Robust boosting for regression problems," Computational Statistics & Data Analysis, Elsevier, vol. 153(C).
  • Handle: RePEc:eee:csdana:v:153:y:2021:i:c:s0167947320301560
    DOI: 10.1016/j.csda.2020.107065

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947320301560
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2020.107065?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Buhlmann P. & Yu B., 2003. "Boosting With the L2 Loss: Regression and Classification," Journal of the American Statistical Association, American Statistical Association, vol. 98, pages 324-339, January.
    2. Marie-Hélène Roy & Denis Larocque, 2012. "Robustness of random forests for regression," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 24(4), pages 993-1006, December.
    3. Boente, Graciela & Fraiman, Ricardo, 1989. "Robust nonparametric regression estimation," Journal of Multivariate Analysis, Elsevier, vol. 29(2), pages 180-198, May.
    4. Graciela Boente & Alejandra Martínez & Matías Salibián-Barrera, 2017. "Robust estimators for additive models using backfitting," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 29(4), pages 744-767, October.
    5. Lutz, Roman Werner & Kalisch, Markus & Buhlmann, Peter, 2008. "Robustified L2 boosting," Computational Statistics & Data Analysis, Elsevier, vol. 52(7), pages 3331-3341, March.
    6. Hui Zou & Trevor Hastie, 2005. "Addendum: Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(5), pages 768-768, November.
    7. Alexander Hanbo Li & Jelena Bradic, 2018. "Boosting in the Presence of Outliers: Adaptive Classification With Nonconvex Loss Functions," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(522), pages 660-674, April.
    8. Hui Zou & Trevor Hastie, 2005. "Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(2), pages 301-320, April.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    2. Stefanie Hieke & Axel Benner & Richard F Schlenk & Martin Schumacher & Lars Bullinger & Harald Binder, 2016. "Identifying Prognostic SNPs in Clinical Cohorts: Complementing Univariate Analyses by Resampling and Multivariable Modeling," PLOS ONE, Public Library of Science, vol. 11(5), pages 1-18, May.
    3. Sariyar Murat & Schumacher Martin & Binder Harald, 2014. "A boosting approach for adapting the sparsity of risk prediction signatures based on different molecular levels," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 13(3), pages 343-357, June.
    4. Tutz, Gerhard & Binder, Harald, 2007. "Boosting ridge regression," Computational Statistics & Data Analysis, Elsevier, vol. 51(12), pages 6044-6059, August.
    5. Tan, Xueping & Sirichand, Kavita & Vivian, Andrew & Wang, Xinyu, 2022. "Forecasting European carbon returns using dimension reduction techniques: Commodity versus financial fundamentals," International Journal of Forecasting, Elsevier, vol. 38(3), pages 944-969.
    6. Wang Zhu & Wang C.Y., 2010. "Buckley-James Boosting for Survival Analysis with High-Dimensional Biomarker Data," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 9(1), pages 1-33, June.
    7. Kim, Hyun Hak & Swanson, Norman R., 2014. "Forecasting financial and macroeconomic variables using data reduction methods: New empirical evidence," Journal of Econometrics, Elsevier, vol. 178(P2), pages 352-367.
    8. Daye, Z. John & Jeng, X. Jessie, 2009. "Shrinkage and model selection with correlated variables via weighted fusion," Computational Statistics & Data Analysis, Elsevier, vol. 53(4), pages 1284-1298, February.
    9. Carstensen, Kai & Heinrich, Markus & Reif, Magnus & Wolters, Maik H., 2020. "Predicting ordinary and severe recessions with a three-state Markov-switching dynamic factor model," International Journal of Forecasting, Elsevier, vol. 36(3), pages 829-850.
    10. Hou-Tai Chang & Ping-Huai Wang & Wei-Fang Chen & Chen-Ju Lin, 2022. "Risk Assessment of Early Lung Cancer with LDCT and Health Examinations," IJERPH, MDPI, vol. 19(8), pages 1-12, April.
    11. Wang, Qiao & Zhou, Wei & Cheng, Yonggang & Ma, Gang & Chang, Xiaolin & Miao, Yu & Chen, E, 2018. "Regularized moving least-square method and regularized improved interpolating moving least-square method with nonsingular moment matrices," Applied Mathematics and Computation, Elsevier, vol. 325(C), pages 120-145.
    12. Mkhadri, Abdallah & Ouhourane, Mohamed, 2013. "An extended variable inclusion and shrinkage algorithm for correlated variables," Computational Statistics & Data Analysis, Elsevier, vol. 57(1), pages 631-644.
    13. Lucian Belascu & Alexandra Horobet & Georgiana Vrinceanu & Consuela Popescu, 2021. "Performance Dissimilarities in European Union Manufacturing: The Effect of Ownership and Technological Intensity," Sustainability, MDPI, vol. 13(18), pages 1-19, September.
    14. Candelon, B. & Hurlin, C. & Tokpavi, S., 2012. "Sampling error and double shrinkage estimation of minimum variance portfolios," Journal of Empirical Finance, Elsevier, vol. 19(4), pages 511-527.
    15. Andrea Carriero & Todd E. Clark & Massimiliano Marcellino, 2022. "Specification Choices in Quantile Regression for Empirical Macroeconomics," Working Papers 22-25, Federal Reserve Bank of Cleveland.
    16. Kim, Hyun Hak & Swanson, Norman R., 2018. "Mining big data using parsimonious factor, machine learning, variable selection and shrinkage methods," International Journal of Forecasting, Elsevier, vol. 34(2), pages 339-354.
    17. Shuichi Kawano, 2014. "Selection of tuning parameters in bridge regression models via Bayesian information criterion," Statistical Papers, Springer, vol. 55(4), pages 1207-1223, November.
    18. Chuliá, Helena & Garrón, Ignacio & Uribe, Jorge M., 2024. "Daily growth at risk: Financial or real drivers? The answer is not always the same," International Journal of Forecasting, Elsevier, vol. 40(2), pages 762-776.
    19. Enrico Bergamini & Georg Zachmann, 2020. "Exploring EU’s Regional Potential in Low-Carbon Technologies," Sustainability, MDPI, vol. 13(1), pages 1-28, December.
    20. Qianyun Li & Runmin Shi & Faming Liang, 2019. "Drug sensitivity prediction with high-dimensional mixture regression," PLOS ONE, Public Library of Science, vol. 14(2), pages 1-18, February.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:153:y:2021:i:c:s0167947320301560. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.