Printed from https://ideas.repec.org/a/eee/csdana/v196y2024ics0167947324000446.html

Variable selection using data splitting and projection for principal fitted component models in high dimension

Author

Listed:
  • Baek, Seungchul
  • Park, Hoyoung
  • Park, Junyong

Abstract

Sufficient dimension reduction (SDR) is an effective way to detect nonlinear relationships between a response variable and covariates by reducing the dimensionality of the covariates without loss of information. The principal fitted component (PFC) model implements SDR using a class of basis functions; however, the PFC model is inefficient when there are many irrelevant or noisy covariates. A few studies have addressed variable selection in the PFC model via penalized regression or sequential likelihood ratio tests. A novel variable selection technique for the PFC model is proposed that incorporates recent developments in multiple testing, namely mirror statistics and random data splitting. It is highlighted how a mirror statistic is constructed in the PFC model by projecting coefficients onto the space generated from the other half of the split data. The proposed method is superior to some existing methods in terms of false discovery rate (FDR) control and applicability to high-dimensional cases. In particular, it outperforms other methods as the number of covariates grows, which is appealing in high-dimensional data analysis. Simulation studies and analyses of real data sets demonstrate the finite-sample performance and the gains over existing methods.
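The core idea of mirror statistics with data splitting can be illustrated outside the PFC setting. The sketch below is a generic, minimal version for a linear model, not the paper's PFC-specific construction: the data are split in half, coefficients are estimated independently on each half, and a mirror statistic is formed that is large and positive for relevant covariates but symmetric about zero for irrelevant ones, so that a data-driven cutoff controls the FDR. The function name `mirror_select` and the specific mirror formula `sign(b1*b2)*(|b1|+|b2|)` are illustrative assumptions.

```python
import numpy as np

def mirror_select(X, y, q=0.1, rng=None):
    """Select variables via random data splitting and mirror statistics.

    A minimal sketch: OLS estimates b1, b2 on two disjoint halves of the
    rows give mirrors M_j whose distribution is symmetric about 0 for
    null covariates; the cutoff is the smallest t whose estimated FDR
    #{M_j <= -t} / #{M_j >= t} is at most q.
    """
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    idx = rng.permutation(n)
    half1, half2 = idx[: n // 2], idx[n // 2 :]
    b1, *_ = np.linalg.lstsq(X[half1], y[half1], rcond=None)
    b2, *_ = np.linalg.lstsq(X[half2], y[half2], rcond=None)
    # Large and positive when both halves agree on a strong signal,
    # roughly symmetric around zero for null coordinates.
    m = np.sign(b1 * b2) * (np.abs(b1) + np.abs(b2))
    for t in np.sort(np.abs(m)):
        fdr_hat = np.sum(m <= -t) / max(np.sum(m >= t), 1)
        if fdr_hat <= q:
            return np.flatnonzero(m >= t)
    return np.array([], dtype=int)

# Toy example: 200 observations, 20 covariates, first 3 relevant.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 20))
y = 2 * X[:, 0] + 2 * X[:, 1] + 2 * X[:, 2] + rng.standard_normal(200)
print(mirror_select(X, y, q=0.1, rng=1))
```

The negative mirrors act as an internal estimate of the number of false positives at each cutoff, which is what allows FDR control without knowing the null distribution; the paper's contribution is to build an analogous mirror statistic inside the PFC model via projection between the two splits.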

Suggested Citation

  • Baek, Seungchul & Park, Hoyoung & Park, Junyong, 2024. "Variable selection using data splitting and projection for principal fitted component models in high dimension," Computational Statistics & Data Analysis, Elsevier, vol. 196(C).
  • Handle: RePEc:eee:csdana:v:196:y:2024:i:c:s0167947324000446
    DOI: 10.1016/j.csda.2024.107960

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947324000446
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2024.107960?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Hung Hung & Su‐Yun Huang, 2019. "Sufficient dimension reduction via random‐partitions for the large‐p‐small‐n problem," Biometrics, The International Biometric Society, vol. 75(1), pages 245-255, March.
    2. Zhang, Hong-Fan, 2021. "Minimum Average Variance Estimation with group Lasso for the multivariate response Central Mean Subspace," Journal of Multivariate Analysis, Elsevier, vol. 184(C).
    3. Zhou Yu & Yuexiao Dong & Li-Xing Zhu, 2016. "Trace Pursuit: A General Framework for Model-Free Variable Selection," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(514), pages 813-821, April.
    4. Wang, Pei & Yin, Xiangrong & Yuan, Qingcong & Kryscio, Richard, 2021. "Feature filter for estimating central mean subspace and its sparse solution," Computational Statistics & Data Analysis, Elsevier, vol. 163(C).
    5. Fang, Fang & Yu, Zhou, 2020. "Model averaging assisted sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 152(C).
    6. Pircalabelu, Eugen & Artemiou, Andreas, 2021. "Graph informed sliced inverse regression," Computational Statistics & Data Analysis, Elsevier, vol. 164(C).
    7. Wu, Runxiong & Chen, Xin, 2021. "MM algorithms for distance covariance based sufficient dimension reduction and sufficient variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 155(C).
    8. Yuan, Qingcong & Chen, Xianyan & Ke, Chenlu & Yin, Xiangrong, 2022. "Independence index sufficient variable screening for categorical responses," Computational Statistics & Data Analysis, Elsevier, vol. 174(C).
    9. Weng, Jiaying, 2022. "Fourier transform sparse inverse regression estimators for sufficient variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 168(C).
    10. Zhenghui Feng & Lu Lin & Ruoqing Zhu & Lixing Zhu, 2020. "Nonparametric variable selection and its application to additive models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 72(3), pages 827-854, June.
    11. Yang Liu & Francesca Chiaromonte & Bing Li, 2017. "Structured Ordinary Least Squares: A Sufficient Dimension Reduction approach for regressions with partitioned predictors and heterogeneous units," Biometrics, The International Biometric Society, vol. 73(2), pages 529-539, June.
    12. Loann David Denis Desboulets, 2018. "A Review on Variable Selection in Regression Analysis," Econometrics, MDPI, vol. 6(4), pages 1-27, November.
    13. Wang, Qin & Xue, Yuan, 2021. "An ensemble of inverse moment estimators for sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 161(C).
    14. He, Yong & Zhang, Liang & Ji, Jiadong & Zhang, Xinsheng, 2019. "Robust feature screening for elliptical copula regression model," Journal of Multivariate Analysis, Elsevier, vol. 173(C), pages 568-582.
    15. Zhang, Shucong & Zhou, Yong, 2018. "Variable screening for ultrahigh dimensional heterogeneous data via conditional quantile correlations," Journal of Multivariate Analysis, Elsevier, vol. 165(C), pages 1-13.
    16. Shin, Seung Jun & Artemiou, Andreas, 2017. "Penalized principal logistic regression for sparse sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 111(C), pages 48-58.
    17. Liming Wang & Xingxiang Li & Xiaoqing Wang & Peng Lai, 2022. "Unified mean-variance feature screening for ultrahigh-dimensional regression," Computational Statistics, Springer, vol. 37(4), pages 1887-1918, September.
    18. Feng, Zheng-Hui & Lin, Lu & Zhu, Ruo-Qing & Zhu, Li-Xing, 2018. "Nonparametric Variable Selection and Its Application to Additive Models," IRTG 1792 Discussion Papers 2018-002, Humboldt University of Berlin, International Research Training Group 1792 "High Dimensional Nonstationary Time Series".
    19. Emmanuel Jordy Menvouta & Sven Serneels & Tim Verdonck, 2022. "Sparse dimension reduction based on energy and ball statistics," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 16(4), pages 951-975, December.
    20. Akira Shinkyu, 2023. "Forward Selection for Feature Screening and Structure Identification in Varying Coefficient Models," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 85(1), pages 485-511, February.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:196:y:2024:i:c:s0167947324000446. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form .

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.