Printed from https://ideas.repec.org/a/gam/jmathe/v11y2023i24p4957-d1300387.html

On the Oracle Properties of Bayesian Random Forest for Sparse High-Dimensional Gaussian Regression

Author

Listed:
  • Oyebayo Ridwan Olaniran

    (Department of Statistics, Faculty of Physical Sciences, University of Ilorin, Ilorin 240101, Nigeria
    These authors contributed equally to this work.)

  • Ali Rashash R. Alzahrani

    (Mathematics Department, Faculty of Sciences, Umm Al-Qura University, Makkah 24382, Saudi Arabia
    These authors contributed equally to this work.)

Abstract

Random forest (RF) is a widely used technique for data prediction and variable selection. However, the variable selection aspect of RF can become unreliable when there are more irrelevant variables than relevant ones. In response, we introduced the Bayesian random forest (BRF) method, specifically designed for high-dimensional datasets with a sparse covariate structure. Our research demonstrates that BRF possesses the oracle property, meaning it achieves strong selection consistency without compromising efficiency or introducing bias.
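The setting the abstract describes can be made concrete with a small simulation. The sketch below is a hypothetical illustration only, not the paper's BRF procedure: it generates a sparse high-dimensional Gaussian regression problem (far more irrelevant covariates than relevant ones) and applies a crude marginal-correlation screen to show that the few truly relevant covariates can, in principle, be recovered. All names and parameter choices here are illustrative assumptions.

```python
import numpy as np

# Sparse high-dimensional Gaussian regression: n observations,
# p >> n candidate covariates, of which only s are truly relevant.
rng = np.random.default_rng(0)
n, p, s = 200, 500, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 3.0                      # only the first s covariates matter
y = X @ beta + rng.standard_normal(n)

# A simple marginal screen (NOT the paper's BRF method): rank covariates
# by the magnitude of their empirical covariance with y, keep the top s.
scores = np.abs(X.T @ (y - y.mean())) / n
selected = np.argsort(scores)[::-1][:s]
print(sorted(selected.tolist()))    # with a signal this strong, typically {0,...,4}
```

This toy screen works only because the signal is strong and the covariates are independent; the paper's contribution is a Bayesian forest-based selector with provable oracle guarantees in settings where such naive screening fails.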

Suggested Citation

  • Oyebayo Ridwan Olaniran & Ali Rashash R. Alzahrani, 2023. "On the Oracle Properties of Bayesian Random Forest for Sparse High-Dimensional Gaussian Regression," Mathematics, MDPI, vol. 11(24), pages 1-29, December.
  • Handle: RePEc:gam:jmathe:v:11:y:2023:i:24:p:4957-:d:1300387

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/11/24/4957/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/11/24/4957/
    Download Restriction: no

    References listed on IDEAS

    1. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    2. Todd E. Clark & Florian Huber & Gary Koop & Massimiliano Marcellino & Michael Pfarrhofer, 2023. "Tail Forecasting With Multivariate Bayesian Additive Regression Trees," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 64(3), pages 979-1022, August.
    3. Kyoungmi Hwang & Dohyun Kim & Kyungsik Lee & Chungmok Lee & Sungsoo Park, 2017. "Embedded variable selection method using signomial classification," Annals of Operations Research, Springer, vol. 254(1), pages 89-109, July.
    4. Qiang Wang & Thanh-Tung Nguyen & Joshua Z. Huang & Thuy Thi Nguyen, 2018. "An efficient random forests algorithm for high dimensional data classification," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 12(4), pages 953-972, December.
    5. Taddy, Matthew A. & Gramacy, Robert B. & Polson, Nicholas G., 2011. "Dynamic Trees for Learning and Design," Journal of the American Statistical Association, American Statistical Association, vol. 106(493), pages 109-123.
    6. Fan, Jianqing & Li, Runze, 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    7. Antonio R. Linero, 2018. "Bayesian Regression Trees for High-Dimensional Prediction and Variable Selection," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(522), pages 626-636, April.
    8. Brett Presnell, 2022. "A Geometric Derivation of the Cantor Distribution," The American Statistician, Taylor & Francis Journals, vol. 76(1), pages 73-77, January.
    9. Silke Janitza & Ender Celik & Anne-Laure Boulesteix, 2018. "A computationally fast variable importance test for random forests for high-dimensional data," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 12(4), pages 885-915, December.
    10. Kyoungmi Hwang & Kyungsik Lee & Sungsoo Park, 2017. "Variable selection methods for multi-class classification using signomial function," Journal of the Operational Research Society, Palgrave Macmillan;The OR Society, vol. 68(9), pages 1117-1130, September.
    11. Antonio R. Linero & Yun Yang, 2018. "Bayesian regression tree ensembles that adapt to smoothness and sparsity," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 80(5), pages 1087-1110, November.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project, subscribe to its RSS feed for this item.


    Cited by:

    1. Oyebayo Ridwan Olaniran & Aliu Omotayo Sikiru & Jeza Allohibi & Abdulmajeed Atiah Alharbi & Nada MohammedSaeed Alharbi, 2025. "Hybrid Random Feature Selection and Recurrent Neural Network for Diabetes Prediction," Mathematics, MDPI, vol. 13(4), pages 1-25, February.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yi Liu & Veronika Ročková & Yuexi Wang, 2021. "Variable selection with ABC Bayesian forests," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(3), pages 453-481, July.
    2. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    3. Xu, Yang & Zhao, Shishun & Hu, Tao & Sun, Jianguo, 2021. "Variable selection for generalized odds rate mixture cure models with interval-censored failure time data," Computational Statistics & Data Analysis, Elsevier, vol. 156(C).
    4. Emmanouil Androulakis & Christos Koukouvinos & Kalliopi Mylona & Filia Vonta, 2010. "A real survival analysis application via variable selection methods for Cox's proportional hazards model," Journal of Applied Statistics, Taylor & Francis Journals, vol. 37(8), pages 1399-1406.
    5. Jun Zhu & Hsin‐Cheng Huang & Perla E. Reyes, 2010. "On selection of spatial linear models for lattice data," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 72(3), pages 389-402, June.
    6. Lam, Clifford, 2008. "Estimation of large precision matrices through block penalization," LSE Research Online Documents on Economics 31543, London School of Economics and Political Science, LSE Library.
    7. Ping Wu & Xinchao Luo & Peirong Xu & Lixing Zhu, 2017. "New variable selection for linear mixed-effects models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 69(3), pages 627-646, June.
    8. Naimoli, Antonio, 2022. "Modelling the persistence of Covid-19 positivity rate in Italy," Socio-Economic Planning Sciences, Elsevier, vol. 82(PA).
    9. Xia Chen & Liyue Mao, 2020. "Penalized empirical likelihood for partially linear errors-in-variables models," AStA Advances in Statistical Analysis, Springer;German Statistical Society, vol. 104(4), pages 597-623, December.
    10. Jian Guo & Elizaveta Levina & George Michailidis & Ji Zhu, 2010. "Pairwise Variable Selection for High-Dimensional Model-Based Clustering," Biometrics, The International Biometric Society, vol. 66(3), pages 793-804, September.
    11. Xiaotong Shen & Wei Pan & Yunzhang Zhu & Hui Zhou, 2013. "On constrained and regularized high-dimensional regression," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 65(5), pages 807-832, October.
    12. Oguzhan Cepni & I. Ethem Guney & Norman R. Swanson, 2020. "Forecasting and nowcasting emerging market GDP growth rates: The role of latent global economic policy uncertainty and macroeconomic data surprise factors," Journal of Forecasting, John Wiley & Sons, Ltd., vol. 39(1), pages 18-36, January.
    13. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
    14. Tizheng Li & Xiaojuan Kang, 2022. "Variable selection of higher-order partially linear spatial autoregressive model with a diverging number of parameters," Statistical Papers, Springer, vol. 63(1), pages 243-285, February.
    15. Alexandre Belloni & Victor Chernozhukov & Ivan Fernandez-Val & Christian Hansen, 2013. "Program evaluation with high-dimensional data," CeMMAP working papers CWP77/13, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    16. Shan Luo & Zehua Chen, 2014. "Sequential Lasso Cum EBIC for Feature Selection With Ultra-High Dimensional Feature Space," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 109(507), pages 1229-1240, September.
    17. Caner, Mehmet & Fan, Qingliang, 2015. "Hybrid generalized empirical likelihood estimators: Instrument selection with adaptive lasso," Journal of Econometrics, Elsevier, vol. 187(1), pages 256-274.
    18. Shi Chen & Wolfgang Karl Hardle & Brenda L'opez Cabrera, 2020. "Regularization Approach for Network Modeling of German Power Derivative Market," Papers 2009.09739, arXiv.org.
    19. Wang, Christina Dan & Chen, Zhao & Lian, Yimin & Chen, Min, 2022. "Asset selection based on high frequency Sharpe ratio," Journal of Econometrics, Elsevier, vol. 227(1), pages 168-188.
    20. Yanlin Tang & Liya Xiang & Zhongyi Zhu, 2014. "Risk Factor Selection in Rate Making: EM Adaptive LASSO for Zero‐Inflated Poisson Regression Models," Risk Analysis, John Wiley & Sons, vol. 34(6), pages 1112-1127, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:11:y:2023:i:24:p:4957-:d:1300387. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.