Printed from https://ideas.repec.org/a/gam/jmathe/v13y2025i4p587-d1588218.html

Foundations and Innovations in Data Fusion and Ensemble Learning for Effective Consensus

Authors

Listed:
  • Ke-Lin Du

    (School of Mechanical and Electrical Engineering, Guangdong University of Science and Technology, Dongguan 523668, China)

  • Rengong Zhang

    (Zhejiang Yugong Information Technology Co., Ltd., Changhe Road 475, Hangzhou 310002, China)

  • Bingchun Jiang

    (School of Mechanical and Electrical Engineering, Guangdong University of Science and Technology, Dongguan 523668, China)

  • Jie Zeng

    (Shenzhen Feng Xing Tai Bao Technology Co., Ltd., Shenzhen 518063, China)

  • Jiabin Lu

    (Faculty of Electromechanical Engineering, Guangdong University of Technology, Guangzhou 510006, China)

Abstract

Ensemble learning and data fusion techniques play a crucial role in modern machine learning, enhancing predictive performance, robustness, and generalization. This paper provides a comprehensive survey of ensemble methods, covering foundational techniques such as bagging, boosting, and random forests, as well as advanced topics including multiclass classification, multiview learning, multiple kernel learning, and the Dempster–Shafer theory of evidence. We present a comparative analysis of ensemble learning and deep learning, highlighting their respective strengths, limitations, and synergies. Additionally, we examine the theoretical foundations of ensemble methods, including bias–variance trade-offs, margin theory, and optimization-based frameworks, while analyzing computational trade-offs related to training complexity, inference efficiency, and storage requirements. To enhance accessibility, we provide a structured comparative summary of key ensemble techniques. Furthermore, we discuss emerging research directions, such as adaptive ensemble methods, hybrid deep learning approaches, and multimodal data fusion, as well as challenges related to interpretability, model selection, and handling noisy data in high-stakes applications. By integrating theoretical insights with practical considerations, this survey serves as a valuable resource for researchers and practitioners seeking to understand the evolving landscape of ensemble learning and its future prospects.
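Among the fusion techniques the abstract lists, the Dempster–Shafer theory of evidence is the one most directly tied to "effective consensus": Dempster's rule combines mass functions from independent sources while renormalizing away conflicting mass. The following stand-alone sketch (not code from the surveyed paper; the frame, mass values, and function name are illustrative assumptions) shows the rule for two sources over a two-hypothesis frame:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Each mass function is a dict mapping frozenset hypotheses to
    non-negative masses that sum to 1. Mass on empty intersections
    is treated as conflict and renormalized away.
    """
    combined = {}
    conflict = 0.0
    for (x, mx), (y, my) in product(m1.items(), m2.items()):
        inter = x & y
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mx * my
        else:
            conflict += mx * my  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    # Normalize by (1 - K), where K is the total conflicting mass
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

# Illustrative two-hypothesis frame {A, B} with made-up masses
A, B = frozenset({"A"}), frozenset({"B"})
AB = A | B  # the full frame (ignorance)
m1 = {A: 0.6, B: 0.3, AB: 0.1}
m2 = {A: 0.5, B: 0.2, AB: 0.3}
fused = combine(m1, m2)
# Conflict K = 0.27; fused masses ≈ {A}: 0.726, {B}: 0.233, {A,B}: 0.041
```

Note how the combined belief in {A} exceeds either source's individual mass: agreement between independent sources reinforces a hypothesis, which is the consensus behavior the survey discusses.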

Suggested Citation

  • Ke-Lin Du & Rengong Zhang & Bingchun Jiang & Jie Zeng & Jiabin Lu, 2025. "Foundations and Innovations in Data Fusion and Ensemble Learning for Effective Consensus," Mathematics, MDPI, vol. 13(4), pages 1-49, February.
  • Handle: RePEc:gam:jmathe:v:13:y:2025:i:4:p:587-:d:1588218

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/13/4/587/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/13/4/587/
    Download Restriction: no


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ke-Lin Du & Rengong Zhang & Bingchun Jiang & Jie Zeng & Jiabin Lu, 2025. "Understanding Machine Learning Principles: Learning, Inference, Generalization, and Computational Learning Theory," Mathematics, MDPI, vol. 13(3), pages 1-56, January.
    2. Wang, Qing & Chen, Shiwen, 2015. "A general class of linearly extrapolated variance estimators," Statistics & Probability Letters, Elsevier, vol. 98(C), pages 29-38.
    3. Pietro Amenta & Antonio Lucadamo & Antonello D’Ambra, 2021. "Restricted Common Component and Specific Weight Analysis: A Constrained Explorative Approach for the Customer Satisfaction Evaluation," Social Indicators Research: An International and Interdisciplinary Journal for Quality-of-Life Measurement, Springer, vol. 156(2), pages 409-427, August.
    4. Lei-Hong Zhang & Li-Zhi Liao & Li-Ming Sun, 2011. "Towards the global solution of the maximal correlation problem," Journal of Global Optimization, Springer, vol. 49(1), pages 91-107, January.
    5. Jerinsh Jeyapaulraj & Dhruv Desai & Peter Chu & Dhagash Mehta & Stefano Pasquali & Philip Sommer, 2022. "Supervised similarity learning for corporate bonds using Random Forest proximities," Papers 2207.04368, arXiv.org, revised Oct 2022.
    6. Chu, Chi-Yang & Henderson, Daniel J. & Parmeter, Christopher F., 2017. "On discrete Epanechnikov kernel functions," Computational Statistics & Data Analysis, Elsevier, vol. 116(C), pages 79-105.
    7. Walter Kristof, 1967. "Orthogonal inter-battery factor analysis," Psychometrika, Springer;The Psychometric Society, vol. 32(2), pages 199-227, June.
    8. Joshua Rosaler & Luca Candelori & Vahagn Kirakosyan & Kharen Musaelian & Ryan Samson & Martin T. Wells & Dhagash Mehta & Stefano Pasquali, 2025. "Supervised Similarity for High-Yield Corporate Bonds with Quantum Cognition Machine Learning," Papers 2502.01495, arXiv.org.
    9. David M. Ritzwoller & Vasilis Syrgkanis, 2024. "Simultaneous Inference for Local Structural Parameters with Random Forests," Papers 2405.07860, arXiv.org, revised Sep 2024.
    10. Mendez, Guillermo & Lohr, Sharon, 2011. "Estimating residual variance in random forest regression," Computational Statistics & Data Analysis, Elsevier, vol. 55(11), pages 2937-2950, November.
    11. Li, Yiliang & Bai, Xiwen & Wang, Qi & Ma, Zhongjun, 2022. "A big data approach to cargo type prediction and its implications for oil trade estimation," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 165(C).
    12. Yi Fu & Shuai Cao & Tao Pang, 2020. "A Sustainable Quantitative Stock Selection Strategy Based on Dynamic Factor Adjustment," Sustainability, MDPI, vol. 12(10), pages 1-12, May.
    13. José María Sarabia & Faustino Prieto & Vanesa Jordá & Stefan Sperlich, 2020. "A Note on Combining Machine Learning with Statistical Modeling for Financial Data Analysis," Risks, MDPI, vol. 8(2), pages 1-14, April.
    14. Biau, Gérard & Devroye, Luc, 2010. "On the layered nearest neighbour estimate, the bagged nearest neighbour estimate and the random forest method in regression and classification," Journal of Multivariate Analysis, Elsevier, vol. 101(10), pages 2499-2518, November.
    15. Olivier BIAU & Angela D´ELIA, 2010. "Euro Area GDP Forecast Using Large Survey Dataset - A Random Forest Approach," EcoMod2010 259600029, EcoMod.
    16. Cleridy E. Lennert‐Cody & Richard A. Berk, 2007. "Statistical learning procedures for monitoring regulatory compliance: an application to fisheries data," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 170(3), pages 671-689, July.
    17. Susan Athey & Julie Tibshirani & Stefan Wager, 2016. "Generalized Random Forests," Papers 1610.01271, arXiv.org, revised Apr 2018.
    18. Philippe Goulet Coulombe, 2024. "The macroeconomy as a random forest," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 39(3), pages 401-421, April.
    19. Dean Fantazzini, 2024. "Adaptive Conformal Inference for Computing Market Risk Measures: An Analysis with Four Thousand Crypto-Assets," JRFM, MDPI, vol. 17(6), pages 1-44, June.
    20. Jincheng Shen & Lu Wang & Jeremy M. G. Taylor, 2017. "Estimation of the optimal regime in treatment of prostate cancer recurrence from observational data using flexible weighting models," Biometrics, The International Biometric Society, vol. 73(2), pages 635-645, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:13:y:2025:i:4:p:587-:d:1588218. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.