Printed from https://ideas.repec.org/a/gam/jmathe/v13y2024i1p118-d1557263.html

ADeFS: A Deep Forest Regression-Based Model to Enhance the Performance Based on LASSO and Elastic Net

Authors
  • Zari Farhadi

    (Computerized Intelligence Systems Laboratory, Department of Computer Engineering, University of Tabriz, Tabriz 51666, Iran
    Department of Statistics, Faculty of Mathematics, Statistics and Computer Sciences, University of Tabriz, Tabriz 51666, Iran)

  • Mohammad-Reza Feizi-Derakhshi

    (Computerized Intelligence Systems Laboratory, Department of Computer Engineering, University of Tabriz, Tabriz 51666, Iran)

  • Israa Khalaf Salman Al-Tameemi

    (Computerized Intelligence Systems Laboratory, Department of Computer Engineering, University of Tabriz, Tabriz 51666, Iran)

  • Wonjoon Kim

    (Division of Future Convergence (HCI Science Major), Dongduk Women’s University, Seoul 02748, Republic of Korea)

Abstract

Tree-based algorithms such as random forest and deep forest often contain numerous inefficient trees and forests, which increases the computational load and decreases efficiency. To address this issue, the present paper proposes a model called Automatic Deep Forest Shrinkage (ADeFS), based on shrinkage techniques. The purpose of this model is to reduce the number of trees, enhance the efficiency of the gcforest, and reduce the computational load. The proposed model comprises four steps. The first step is multi-grained scanning, which applies a sliding-window strategy to scan the input data and extract relations between features. The second step is the cascade forest, which is structured layer by layer, with each layer containing several forests: random forests (RF) and completely random forests (CRF). In the third step, which is the innovation of this paper, shrinkage techniques such as LASSO and elastic net (EN) are employed to decrease the number of trees in the last layer of the previous step, thereby reducing the computational load and improving gcforest performance. Among the shrinkage techniques examined, elastic net provides the best performance. Finally, in the last step, the simple average ensemble method is employed to combine the remaining trees. The proposed model is evaluated by Monte Carlo simulation and three real datasets. Findings demonstrate the superior performance of the proposed ADeFS-EN model over both gcforest and RF, as well as over the combination of RF with shrinkage techniques.
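The shrinkage-then-average idea in steps 3 and 4 of the abstract can be sketched in a few lines of Python. This is a minimal stand-in, not the authors' implementation: a single `RandomForestRegressor` substitutes for the last cascade layer (multi-grained scanning and the cascade structure are omitted), and all hyperparameters and variable names here are illustrative assumptions. The target is regressed on the per-tree predictions with an elastic-net penalty; trees whose coefficients shrink to zero are discarded, and the survivors are combined by a simple average.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import ElasticNetCV
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=20, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Stand-in for the last cascade layer: a forest with many trees.
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Shrinkage step: regress y on the individual tree predictions with an
# elastic-net penalty, so redundant trees receive zero coefficients.
P_tr = np.column_stack([t.predict(X_tr) for t in rf.estimators_])
en = ElasticNetCV(l1_ratio=0.5, cv=5, random_state=0).fit(P_tr, y_tr)
kept = np.flatnonzero(en.coef_)  # indices of trees that survive shrinkage
if kept.size == 0:               # fallback: keep every tree
    kept = np.arange(rf.n_estimators)

# Final step: simple average over the surviving trees only.
P_te = np.column_stack([t.predict(X_te) for t in rf.estimators_])
pred_full = P_te.mean(axis=1)             # plain forest average
pred_shrunk = P_te[:, kept].mean(axis=1)  # pruned-forest average

print(f"trees kept: {len(kept)} / {rf.n_estimators}")
print(f"MSE, full forest:   {mean_squared_error(y_te, pred_full):.1f}")
print(f"MSE, pruned forest: {mean_squared_error(y_te, pred_shrunk):.1f}")
```

In this sketch the cross-validated `l1_ratio` is fixed at 0.5; the paper's point is that the elastic net's mix of L1 and L2 penalties prunes trees while tolerating the strong correlation among tree predictions, which a pure LASSO handles less gracefully.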

Suggested Citation

  • Zari Farhadi & Mohammad-Reza Feizi-Derakhshi & Israa Khalaf Salman Al-Tameemi & Wonjoon Kim, 2024. "ADeFS: A Deep Forest Regression-Based Model to Enhance the Performance Based on LASSO and Elastic Net," Mathematics, MDPI, vol. 13(1), pages 1-26, December.
  • Handle: RePEc:gam:jmathe:v:13:y:2024:i:1:p:118-:d:1557263

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/13/1/118/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/13/1/118/
    Download Restriction: no

    References listed on IDEAS

    1. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    2. Jiaman Ding & Qingbo Luo & Lianyin Jia & Jinguo You, 2020. "Deep Forest-Based Fault Diagnosis Method for Chemical Process," Mathematical Problems in Engineering, Hindawi, vol. 2020, pages 1-15, January.
    3. Harrison, David Jr. & Rubinfeld, Daniel L., 1978. "Hedonic housing prices and the demand for clean air," Journal of Environmental Economics and Management, Elsevier, vol. 5(1), pages 81-102, March.
    4. Hui Zou & Trevor Hastie, 2005. "Addendum: Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(5), pages 768-768, November.
    5. Hui Zou & Trevor Hastie, 2005. "Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(2), pages 301-320, April.
    6. Ming Yuan & Yi Lin, 2006. "Model selection and estimation in regression with grouped variables," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 68(1), pages 49-67, February.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    2. Yize Zhao & Matthias Chung & Brent A. Johnson & Carlos S. Moreno & Qi Long, 2016. "Hierarchical Feature Selection Incorporating Known and Novel Biological Information: Identifying Genomic Features Related to Prostate Cancer Recurrence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(516), pages 1427-1439, October.
    3. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
    4. Capanu, Marinela & Giurcanu, Mihai & Begg, Colin B. & Gönen, Mithat, 2023. "Subsampling based variable selection for generalized linear models," Computational Statistics & Data Analysis, Elsevier, vol. 184(C).
    5. Yu-Min Yen, 2010. "A Note on Sparse Minimum Variance Portfolios and Coordinate-Wise Descent Algorithms," Papers 1005.5082, arXiv.org, revised Sep 2013.
    6. Tomáš Plíhal, 2021. "Scheduled macroeconomic news announcements and Forex volatility forecasting," Journal of Forecasting, John Wiley & Sons, Ltd., vol. 40(8), pages 1379-1397, December.
    7. Loann David Denis Desboulets, 2018. "A Review on Variable Selection in Regression Analysis," Econometrics, MDPI, vol. 6(4), pages 1-27, November.
    8. Osamu Komori & Shinto Eguchi & John B. Copas, 2015. "Generalized t-statistic for two-group classification," Biometrics, The International Biometric Society, vol. 71(2), pages 404-416, June.
    9. Zhang, Tonglin, 2024. "Variables selection using L0 penalty," Computational Statistics & Data Analysis, Elsevier, vol. 190(C).
    10. Takumi Saegusa & Tianzhou Ma & Gang Li & Ying Qing Chen & Mei-Ling Ting Lee, 2020. "Variable Selection in Threshold Regression Model with Applications to HIV Drug Adherence Data," Statistics in Biosciences, Springer;International Chinese Statistical Association, vol. 12(3), pages 376-398, December.
    11. Yinjun Chen & Hao Ming & Hu Yang, 2024. "Efficient variable selection for high-dimensional multiplicative models: a novel LPRE-based approach," Statistical Papers, Springer, vol. 65(6), pages 3713-3737, August.
    12. Korobilis, Dimitris, 2013. "Hierarchical shrinkage priors for dynamic regressions with many predictors," International Journal of Forecasting, Elsevier, vol. 29(1), pages 43-59.
    13. Sophie Lambert-Lacroix & Laurent Zwald, 2016. "The adaptive BerHu penalty in robust regression," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 28(3), pages 487-514, September.
    14. Huicong Yu & Jiaqi Wu & Weiping Zhang, 2024. "Simultaneous subgroup identification and variable selection for high dimensional data," Computational Statistics, Springer, vol. 39(6), pages 3181-3205, September.
    15. Wentao Wang & Jiaxuan Liang & Rong Liu & Yunquan Song & Min Zhang, 2022. "A Robust Variable Selection Method for Sparse Online Regression via the Elastic Net Penalty," Mathematics, MDPI, vol. 10(16), pages 1-18, August.
    16. Zanhua Yin, 2020. "Variable selection for sparse logistic regression," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 83(7), pages 821-836, October.
    17. Benjamin Poignard, 2020. "Asymptotic theory of the adaptive Sparse Group Lasso," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 72(1), pages 297-328, February.
    18. Qingliang Fan & Yaqian Wu, 2020. "Endogenous Treatment Effect Estimation with some Invalid and Irrelevant Instruments," Papers 2006.14998, arXiv.org.
    19. Matteo Barigozzi & Marc Hallin, 2017. "A network analysis of the volatility of high dimensional financial series," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 66(3), pages 581-605, April.
    20. Ricardo P. Masini & Marcelo C. Medeiros & Eduardo F. Mendes, 2023. "Machine learning advances for time series forecasting," Journal of Economic Surveys, Wiley Blackwell, vol. 37(1), pages 76-111, February.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:13:y:2024:i:1:p:118-:d:1557263. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.