
Regression Trees and Ensemble for Multivariate Outcomes

Author

Listed:
  • Evan L. Reynolds (University of Michigan)
  • Brian C. Callaghan (University of Michigan)
  • Michael Gaies (University of Cincinnati)
  • Mousumi Banerjee (University of Michigan)

Abstract

Tree-based methods have become one of the most flexible, intuitive, and powerful analytic tools for exploring complex data structures. The best-documented, and arguably most popular, uses of tree-based methods are in biomedical research, where multivariate outcomes occur commonly (e.g., diastolic and systolic blood pressure, and nerve conduction measures in studies of neuropathy). Existing tree-based methods for multivariate outcomes do not appropriately account for the correlation in such data. In this paper, we develop goodness-of-split measures for building multivariate regression trees for continuous multivariate outcomes. We propose two general approaches: minimizing within-node homogeneity and maximizing between-node separation. Within-node homogeneity is measured using the average Mahalanobis distance and the determinant of the variance-covariance matrix. Between-node separation is measured using the Mahalanobis distance, Euclidean distance, and standardized Euclidean distance. To enhance prediction accuracy, we extend the single multivariate regression tree to an ensemble of multivariate trees. Extensive simulations are presented to examine the properties of our goodness-of-split measures. Finally, the proposed methods are illustrated using two clinical datasets of neuropathy and pediatric cardiac surgery.
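To make the two families of goodness-of-split measures concrete, here is a minimal sketch in Python/NumPy. It is not the authors' implementation: the function names, the pseudo-inverse used to guard against singular covariance matrices, and the pooled covariance used for the between-node Mahalanobis distance are illustrative assumptions. The sketch only shows how, for a candidate split of the outcome matrix Y (n observations by q outcomes) into two child nodes, the within-node homogeneity measures (average Mahalanobis distance to the node mean, determinant of the variance-covariance matrix) and the between-node separation measures (Mahalanobis, Euclidean, and standardized Euclidean distance between child-node mean vectors) might be computed.

    import numpy as np

    def within_node_homogeneity(Y, measure="mahalanobis"):
        # Y: n x q outcome matrix for the observations falling in one node.
        # Smaller values indicate a more homogeneous node.
        mu = Y.mean(axis=0)
        S = np.cov(Y, rowvar=False)          # q x q variance-covariance matrix
        if measure == "determinant":
            return np.linalg.det(S)          # generalized variance of the node
        # Average squared Mahalanobis distance of each observation to the node mean.
        S_inv = np.linalg.pinv(S)            # pseudo-inverse in case S is singular (assumption)
        d = Y - mu
        return np.mean(np.einsum("ij,jk,ik->i", d, S_inv, d))

    def between_node_separation(Y_left, Y_right, measure="mahalanobis"):
        # Separation between the mean outcome vectors of two candidate child nodes.
        # Larger values indicate a better split.
        mu_l, mu_r = Y_left.mean(axis=0), Y_right.mean(axis=0)
        diff = mu_l - mu_r
        if measure == "euclidean":
            return np.sqrt(diff @ diff)
        if measure == "std_euclidean":
            s = np.concatenate([Y_left, Y_right]).std(axis=0)   # per-outcome scaling
            return np.sqrt(np.sum((diff / s) ** 2))
        # Mahalanobis distance using the pooled covariance of the two children (assumption).
        S_pooled = np.cov(np.concatenate([Y_left, Y_right]), rowvar=False)
        return np.sqrt(diff @ np.linalg.pinv(S_pooled) @ diff)

In a tree-growing loop, a candidate split would then be scored either by the node-size-weighted sum of within_node_homogeneity over the two children (to be minimized) or by between_node_separation (to be maximized); the ensemble extension described in the abstract would aggregate terminal-node mean outcome vectors across many such trees, for example trees grown on bootstrap resamples, though the exact resampling scheme is not specified here.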

Suggested Citation

  • Evan L. Reynolds & Brian C. Callaghan & Michael Gaies & Mousumi Banerjee, 2023. "Regression Trees and Ensemble for Multivariate Outcomes," Sankhya B: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 85(1), pages 77-109, May.
  • Handle: RePEc:spr:sankhb:v:85:y:2023:i:1:d:10.1007_s13571-023-00301-z
    DOI: 10.1007/s13571-023-00301-z

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s13571-023-00301-z
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s13571-023-00301-z?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to a source where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. David R. Larsen & Paul L. Speckman, 2004. "Multivariate Regression Trees for Analysis of Abundance Data," Biometrics, The International Biometric Society, vol. 60(2), pages 543-549, June.
    2. Lam, Clifford, 2020. "High-dimensional covariance matrix estimation," LSE Research Online Documents on Economics 101667, London School of Economics and Political Science, LSE Library.
    3. Jianqing Fan & Yuan Liao & Han Liu, 2016. "An overview of the estimation of large covariance and precision matrices," Econometrics Journal, Royal Economic Society, vol. 19(1), pages 1-32, February.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Enrico Bernardi & Matteo Farnè, 2022. "A Log-Det Heuristics for Covariance Matrix Estimation: The Analytic Setup," Stats, MDPI, vol. 5(3), pages 1-11, July.
    2. Kim, Seungkyu & Park, Seongoh & Lim, Johan & Lee, Sang Han, 2023. "Robust tests for scatter separability beyond Gaussianity," Computational Statistics & Data Analysis, Elsevier, vol. 179(C).
    3. Hsiao, Wei-Cheng & Shih, Yu-Shan, 2007. "Splitting variable selection for multivariate regression trees," Statistics & Probability Letters, Elsevier, vol. 77(3), pages 265-271, February.
    4. Sven Husmann & Antoniya Shivarova & Rick Steinert, 2019. "Cross-validated covariance estimators for high-dimensional minimum-variance portfolios," Papers 1910.13960, arXiv.org, revised Oct 2020.
    5. Kashlak, Adam B., 2021. "Non-asymptotic error controlled sparse high dimensional precision matrix estimation," Journal of Multivariate Analysis, Elsevier, vol. 181(C).
    6. Huangdi Yi & Qingzhao Zhang & Cunjie Lin & Shuangge Ma, 2022. "Information‐incorporated Gaussian graphical model for gene expression data," Biometrics, The International Biometric Society, vol. 78(2), pages 512-523, June.
    7. Zhou Tang & Zhangsheng Yu & Cheng Wang, 2020. "A fast iterative algorithm for high-dimensional differential network," Computational Statistics, Springer, vol. 35(1), pages 95-109, March.
    8. Yan Zhang & Jiyuan Tao & Zhixiang Yin & Guoqiang Wang, 2022. "Improved Large Covariance Matrix Estimation Based on Efficient Convex Combination and Its Application in Portfolio Optimization," Mathematics, MDPI, vol. 10(22), pages 1-15, November.
    9. Lidan Tan & Khai X. Chiong & Hyungsik Roger Moon, 2018. "Estimation of High-Dimensional Seemingly Unrelated Regression Models," Papers 1811.05567, arXiv.org.
    10. Li, Degui, 2024. "Estimation of Large Dynamic Covariance Matrices: A Selective Review," Econometrics and Statistics, Elsevier, vol. 29(C), pages 16-30.
    11. Zeyu Wu & Cheng Wang & Weidong Liu, 2023. "A unified precision matrix estimation framework via sparse column-wise inverse operator under weak sparsity," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 75(4), pages 619-648, August.
    12. Hengxu Lin & Dong Zhou & Weiqing Liu & Jiang Bian, 2021. "Deep Risk Model: A Deep Learning Solution for Mining Latent Risk Factors to Improve Covariance Matrix Estimation," Papers 2107.05201, arXiv.org, revised Oct 2021.
    13. Ata Kabán & Efstratios Palias, 2024. "A Bhattacharyya-type Conditional Error Bound for Quadratic Discriminant Analysis," Methodology and Computing in Applied Probability, Springer, vol. 26(4), pages 1-17, December.
    14. Lam, Clifford, 2020. "High-dimensional covariance matrix estimation," LSE Research Online Documents on Economics 101667, London School of Economics and Political Science, LSE Library.
    15. Khai X. Chiong & Hyungsik Roger Moon, 2017. "Estimation of Graphical Models using the $L_{1,2}$ Norm," Papers 1709.10038, arXiv.org, revised Oct 2017.
    16. Kai Yu & William Wheeler & Qizhai Li & Andrew W. Bergen & Neil Caporaso & Nilanjan Chatterjee & Jinbo Chen, 2010. "A Partially Linear Tree-based Regression Model for Multivariate Outcomes," Biometrics, The International Biometric Society, vol. 66(1), pages 89-96, March.
    17. Zhang, Qingzhao & Ma, Shuangge & Huang, Yuan, 2021. "Promote sign consistency in the joint estimation of precision matrices," Computational Statistics & Data Analysis, Elsevier, vol. 159(C).
    18. Gautam Sabnis & Debdeep Pati & Anirban Bhattacharya, 2019. "Compressed Covariance Estimation with Automated Dimension Learning," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 81(2), pages 466-481, December.
    19. Brownlees, Christian & Mesters, Geert, 2021. "Detecting granular time series in large panels," Journal of Econometrics, Elsevier, vol. 220(2), pages 544-561.
    20. Anne Opschoor & André Lucas & István Barra & Dick van Dijk, 2021. "Closed-Form Multi-Factor Copula Models With Observation-Driven Dynamic Factor Loadings," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 39(4), pages 1066-1079, October.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:sankhb:v:85:y:2023:i:1:d:10.1007_s13571-023-00301-z. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.