Printed from https://ideas.repec.org/a/eee/jmvana/v101y2010i10p2499-2518.html

On the layered nearest neighbour estimate, the bagged nearest neighbour estimate and the random forest method in regression and classification

Author

Listed:
  • Biau, Gérard
  • Devroye, Luc

Abstract

Let X_1, ..., X_n be identically distributed random vectors in R^d, independently drawn according to some probability density. An observation X_i is said to be a layered nearest neighbour (LNN) of a point x if the hyperrectangle defined by x and X_i contains no other data points. We first establish consistency results on L_n(x), the number of LNN of x. Then, given a sample (X, Y), (X_1, Y_1), ..., (X_n, Y_n) of independent identically distributed random vectors from R^d x R, one may estimate the regression function r(x) = E[Y | X = x] by the LNN estimate r_n(x), defined as an average over the Y_i's corresponding to those X_i which are LNN of x. Under mild conditions on r, we establish the consistency of E|r_n(x) - r(x)|^p towards 0 as n-->[infinity], for almost all x and all p>=1, and discuss the links between r_n and the random forest estimates of Breiman (2001) [8]. We finally show the universal consistency of the bagged (bootstrap-aggregated) nearest neighbour method for regression and classification.
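The LNN construction in the abstract can be sketched in a few lines of code. The snippet below is a minimal illustration, not the authors' implementation: `is_lnn` and `lnn_estimate` are hypothetical names, and the brute-force O(n) containment check is chosen for clarity only. Boundary ties are ignored, which is harmless under the paper's density assumption (ties occur with probability zero).

```python
import numpy as np

def is_lnn(x, X, i):
    # X[i] is a layered nearest neighbour of x if the axis-aligned
    # hyperrectangle spanned by x and X[i] contains no other sample point.
    lo = np.minimum(x, X[i])
    hi = np.maximum(x, X[i])
    for j in range(len(X)):
        if j == i:
            continue
        if np.all(X[j] >= lo) and np.all(X[j] <= hi):
            return False  # another point falls inside the rectangle
    return True

def lnn_estimate(x, X, Y):
    # The LNN regression estimate: average the responses Y_i over
    # those X_i that are layered nearest neighbours of x.
    idx = [i for i in range(len(X)) if is_lnn(x, X, i)]
    return np.mean(Y[idx]) if idx else np.nan
```

For example, with X = [(0,0), (1,1), (2,2)] and query x = (0.5, 0.5), the points (0,0) and (1,1) are LNN of x, while (2,2) is not, since (1,1) lies inside the rectangle spanned by x and (2,2); the estimate is then the average of the first two responses.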

Suggested Citation

  • Biau, Gérard & Devroye, Luc, 2010. "On the layered nearest neighbour estimate, the bagged nearest neighbour estimate and the random forest method in regression and classification," Journal of Multivariate Analysis, Elsevier, vol. 101(10), pages 2499-2518, November.
  • Handle: RePEc:eee:jmvana:v:101:y:2010:i:10:p:2499-2518

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0047-259X(10)00138-7
    Download Restriction: Full text for ScienceDirect subscribers only

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lin, Yi & Jeon, Yongho, 2006. "Random Forests and Adaptive Nearest Neighbors," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 578-590, June.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Uguccioni, James, 2022. "The long-run effects of parental unemployment in childhood," CLEF Working Paper Series 45, Canadian Labour Economics Forum (CLEF), University of Waterloo.
    2. Tung Duy Luu & Jalal Fadili & Christophe Chesneau, 2021. "Sampling from Non-smooth Distributions Through Langevin Diffusion," Methodology and Computing in Applied Probability, Springer, vol. 23(4), pages 1173-1201, December.
    3. Zhexiao Lin & Fang Han, 2022. "On regression-adjusted imputation estimators of the average treatment effect," Papers 2212.05424, arXiv.org, revised Jan 2023.
    4. Biau, Gérard & Devroye, Luc & Dujmović, Vida & Krzyżak, Adam, 2012. "An affine invariant k-nearest neighbor regression estimate," Journal of Multivariate Analysis, Elsevier, vol. 112(C), pages 24-34.
    5. Marie-Hélène Roy & Denis Larocque, 2012. "Robustness of random forests for regression," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 24(4), pages 993-1006, December.
    6. Gérard Biau & Erwan Scornet, 2016. "A random forest guided tour," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 25(2), pages 197-227, June.
    7. Luu, Tung Duy & Fadili, Jalal & Chesneau, Christophe, 2019. "PAC-Bayesian risk bounds for group-analysis sparse regression by exponential weighting," Journal of Multivariate Analysis, Elsevier, vol. 171(C), pages 209-233.
    8. Mendez, Guillermo & Lohr, Sharon, 2011. "Estimating residual variance in random forest regression," Computational Statistics & Data Analysis, Elsevier, vol. 55(11), pages 2937-2950, November.
    9. Mathlouthi, Walid & Fredette, Marc & Larocque, Denis, 2015. "Regression trees and forests for non-homogeneous Poisson processes," Statistics & Probability Letters, Elsevier, vol. 96(C), pages 204-211.
    10. Irfan Ullah & Rehan Ullah Khan & Fan Yang & Lunchakorn Wuttisittikulkij, 2020. "Deep Learning Image-Based Defect Detection in High Voltage Electrical Equipment," Energies, MDPI, vol. 13(2), pages 1-17, January.
    11. Timothy I. Cannings & Richard J. Samworth, 2017. "Random-projection ensemble classification," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 79(4), pages 959-1035, September.
    12. Susan Athey & Julie Tibshirani & Stefan Wager, 2016. "Generalized Random Forests," Papers 1610.01271, arXiv.org, revised Apr 2018.
    13. Guoyi Zhang & Yan Lu, 2012. "Bias-corrected random forests in regression," Journal of Applied Statistics, Taylor & Francis Journals, vol. 39(1), pages 151-160, March.
    14. Paola Zuccolotto & Marco Sandri & Marica Manisera, 2023. "Spatial performance analysis in basketball with CART, random forest and extremely randomized trees," Annals of Operations Research, Springer, vol. 325(1), pages 495-519, June.
    15. Jincheng Shen & Lu Wang & Jeremy M. G. Taylor, 2017. "Estimation of the optimal regime in treatment of prostate cancer recurrence from observational data using flexible weighting models," Biometrics, The International Biometric Society, vol. 73(2), pages 635-645, June.
    16. Kudraszow, Nadia L. & Vieu, Philippe, 2013. "Uniform consistency of kNN regressors for functional variables," Statistics & Probability Letters, Elsevier, vol. 83(8), pages 1863-1870.
    17. Ramosaj, Burim & Pauly, Markus, 2019. "Consistent estimation of residual variance with random forest Out-Of-Bag errors," Statistics & Probability Letters, Elsevier, vol. 151(C), pages 49-57.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jerinsh Jeyapaulraj & Dhruv Desai & Peter Chu & Dhagash Mehta & Stefano Pasquali & Philip Sommer, 2022. "Supervised similarity learning for corporate bonds using Random Forest proximities," Papers 2207.04368, arXiv.org, revised Oct 2022.
    2. David M. Ritzwoller & Vasilis Syrgkanis, 2024. "Simultaneous Inference for Local Structural Parameters with Random Forests," Papers 2405.07860, arXiv.org, revised Sep 2024.
    3. Mendez, Guillermo & Lohr, Sharon, 2011. "Estimating residual variance in random forest regression," Computational Statistics & Data Analysis, Elsevier, vol. 55(11), pages 2937-2950, November.
    4. Li, Yiliang & Bai, Xiwen & Wang, Qi & Ma, Zhongjun, 2022. "A big data approach to cargo type prediction and its implications for oil trade estimation," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 165(C).
    5. Yi Fu & Shuai Cao & Tao Pang, 2020. "A Sustainable Quantitative Stock Selection Strategy Based on Dynamic Factor Adjustment," Sustainability, MDPI, vol. 12(10), pages 1-12, May.
    6. José María Sarabia & Faustino Prieto & Vanesa Jordá & Stefan Sperlich, 2020. "A Note on Combining Machine Learning with Statistical Modeling for Financial Data Analysis," Risks, MDPI, vol. 8(2), pages 1-14, April.
    7. Olivier BIAU & Angela D´ELIA, 2010. "Euro Area GDP Forecast Using Large Survey Dataset - A Random Forest Approach," EcoMod2010 259600029, EcoMod.
    8. Cleridy E. Lennert‐Cody & Richard A. Berk, 2007. "Statistical learning procedures for monitoring regulatory compliance: an application to fisheries data," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 170(3), pages 671-689, July.
    9. Susan Athey & Julie Tibshirani & Stefan Wager, 2016. "Generalized Random Forests," Papers 1610.01271, arXiv.org, revised Apr 2018.
    10. Philippe Goulet Coulombe, 2024. "The macroeconomy as a random forest," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 39(3), pages 401-421, April.
    11. Jincheng Shen & Lu Wang & Jeremy M. G. Taylor, 2017. "Estimation of the optimal regime in treatment of prostate cancer recurrence from observational data using flexible weighting models," Biometrics, The International Biometric Society, vol. 73(2), pages 635-645, June.
    12. Dhruv Desai & Ashmita Dhiman & Tushar Sharma & Deepika Sharma & Dhagash Mehta & Stefano Pasquali, 2023. "Quantifying Outlierness of Funds from their Categories using Supervised Similarity," Papers 2308.06882, arXiv.org.
    13. Hoora Moradian & Denis Larocque & François Bellavance, 2017. "L1 splitting rules in survival forests," Lifetime Data Analysis: An International Journal Devoted to Statistical Methods and Applications for Time-to-Event Data, Springer, vol. 23(4), pages 671-691, October.
    14. Arlen Dean & Amirhossein Meisami & Henry Lam & Mark P. Van Oyen & Christopher Stromblad & Nick Kastango, 2022. "Quantile regression forests for individualized surgery scheduling," Health Care Management Science, Springer, vol. 25(4), pages 682-709, December.
    15. Mingshu Li & Bhaskarjit Sarmah & Dhruv Desai & Joshua Rosaler & Snigdha Bhagat & Philip Sommer & Dhagash Mehta, 2024. "Quantile Regression using Random Forest Proximities," Papers 2408.02355, arXiv.org.
    16. Lundberg, Ian & Brand, Jennie E. & Jeon, Nanum, 2022. "Researcher reasoning meets computational capacity: Machine learning for social science," SocArXiv s5zc8, Center for Open Science.
    17. Yang, Bill Huajian, 2013. "Modeling Portfolio Risk by Risk Discriminatory Trees and Random Forests," MPRA Paper 57245, University Library of Munich, Germany.
    18. Yifei Sun & Sy Han Chiou & Mei‐Cheng Wang, 2020. "ROC‐guided survival trees and ensembles," Biometrics, The International Biometric Society, vol. 76(4), pages 1177-1189, December.
    19. Charles B. Perkins & J. Christina Wang, 2019. "How Magic a Bullet Is Machine Learning for Credit Analysis? An Exploration with FinTech Lending Data," Working Papers 19-16, Federal Reserve Bank of Boston.
    20. Borup, Daniel & Christensen, Bent Jesper & Mühlbach, Nicolaj Søndergaard & Nielsen, Mikkel Slot, 2023. "Targeting predictors in random forest regression," International Journal of Forecasting, Elsevier, vol. 39(2), pages 841-868.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jmvana:v:101:y:2010:i:10:p:2499-2518. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/622892/description#description .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.