
A new reproducing kernel‐based nonlinear dimension reduction method for survival data

Author

Listed:
  • Wenquan Cui
  • Jianjun Xu
  • Yuehua Wu

Abstract

Based on the theories of sliced inverse regression (SIR) and reproducing kernel Hilbert spaces (RKHS), a new approach to nonlinear dimension reduction for survival data, RDSIR (RKHS-based Double SIR), is proposed. An isometric isomorphism is constructed from the RKHS property, so that a nonlinear function in the RKHS can be represented as an inner product of two elements in the isomorphic feature space. Because survival data are subject to censoring, double slicing is used to estimate a weight function that adjusts for the censoring bias. The nonlinear sufficient dimension reduction (SDR) subspace is then estimated by solving a generalized eigendecomposition problem. The asymptotic properties of the estimator are established via perturbation theory. Finally, the performance of RDSIR is illustrated on simulated and real data. The numerical results show that RDSIR is comparable with linear SDR methods and, most importantly, can effectively extract nonlinear structure from survival data.
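
For readers who want a concrete picture of the estimation steps sketched in the abstract (kernel representation, double slicing over censored and uncensored observations, and a generalized eigendecomposition), the following minimal Python sketch illustrates one plausible implementation. The function names, the Gaussian kernel, the ridge regularizer, and the simplified slicing scheme are illustrative assumptions; in particular, the sketch omits the paper's weight-function adjustment for censoring bias and does not reproduce the authors' exact RDSIR estimator.

    import numpy as np
    from scipy.linalg import eigh

    def rbf_gram(X, gamma=1.0):
        # Gaussian kernel Gram matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2)
        sq = np.sum(X ** 2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
        return np.exp(-gamma * np.maximum(d2, 0.0))

    def kernel_double_sir(X, time, delta, n_slices=5, n_dirs=2, ridge=1e-3, gamma=1.0):
        # X: (n, p) covariates; time: (n,) observed times; delta: (n,) 1 = event, 0 = censored.
        # Returns A of shape (n, n_dirs): the j-th estimated nonlinear direction is
        # f_j(x) = sum_i A[i, j] * k(x_i, x).
        n = X.shape[0]
        K = rbf_gram(X, gamma)
        H = np.eye(n) - np.ones((n, n)) / n             # centering matrix
        Kc = H @ K @ H                                  # centered Gram matrix

        # "Double" slicing: slice the observed times separately within the censored
        # and uncensored groups, so that censoring status also informs the slices.
        M = np.zeros((n, n))
        for d in (0, 1):
            idx = np.where(delta == d)[0]
            if idx.size == 0:
                continue
            order = idx[np.argsort(time[idx])]          # sort each group by observed time
            for sl in np.array_split(order, n_slices):
                if sl.size == 0:
                    continue
                m = Kc[:, sl].mean(axis=1)              # slice mean in kernel coordinates
                M += (sl.size / n) * np.outer(m, m)     # between-slice covariance estimate

        # Generalized eigendecomposition; the ridge term keeps the right-hand
        # side positive definite.
        rhs = Kc @ Kc / n + ridge * K + 1e-8 * np.eye(n)
        vals, vecs = eigh(M, rhs)
        return vecs[:, ::-1][:, :n_dirs]                # leading n_dirs directions

A new observation x is then mapped to the reduced coordinates by f_j(x) = sum_i A[i, j] * k(x_i, x), i.e., by evaluating the kernel between x and the training points and taking the product with the coefficient matrix A.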

Suggested Citation

  • Wenquan Cui & Jianjun Xu & Yuehua Wu, 2023. "A new reproducing kernel‐based nonlinear dimension reduction method for survival data," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics; Finnish Statistical Society; Norwegian Statistical Association; Swedish Statistical Association, vol. 50(3), pages 1365-1390, September.
  • Handle: RePEc:bla:scjsta:v:50:y:2023:i:3:p:1365-1390
    DOI: 10.1111/sjos.12635

    Download full text from publisher

    File URL: https://doi.org/10.1111/sjos.12635
    Download Restriction: no

    File URL: https://libkey.io/10.1111/sjos.12635?utm_source=ideas


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yang Liu & Francesca Chiaromonte & Bing Li, 2017. "Structured Ordinary Least Squares: A Sufficient Dimension Reduction approach for regressions with partitioned predictors and heterogeneous units," Biometrics, The International Biometric Society, vol. 73(2), pages 529-539, June.
    2. Weng, Jiaying, 2022. "Fourier transform sparse inverse regression estimators for sufficient variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 168(C).
    3. Wei Sun & Lexin Li, 2012. "Multiple Loci Mapping via Model-free Variable Selection," Biometrics, The International Biometric Society, vol. 68(1), pages 12-22, March.
    4. Zhang, Hong-Fan, 2021. "Minimum Average Variance Estimation with group Lasso for the multivariate response Central Mean Subspace," Journal of Multivariate Analysis, Elsevier, vol. 184(C).
    5. Wu, Runxiong & Chen, Xin, 2021. "MM algorithms for distance covariance based sufficient dimension reduction and sufficient variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 155(C).
    6. Changrong Yan & Dixin Zhang, 2013. "Sparse dimension reduction for survival data," Computational Statistics, Springer, vol. 28(4), pages 1835-1852, August.
    7. Peter Bühlmann & Jacopo Mandozzi, 2014. "High-dimensional variable screening and bias in subsequent inference, with an empirical comparison," Computational Statistics, Springer, vol. 29(3), pages 407-430, June.
    8. Loann David Denis Desboulets, 2018. "A Review on Variable Selection in Regression Analysis," Econometrics, MDPI, vol. 6(4), pages 1-27, November.
    9. Victor Chernozhukov & Christian Hansen & Yuan Liao, 2015. "A lava attack on the recovery of sums of dense and sparse signals," CeMMAP working papers CWP56/15, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    10. Tan, Xin Lu, 2019. "Optimal estimation of slope vector in high-dimensional linear transformation models," Journal of Multivariate Analysis, Elsevier, vol. 169(C), pages 179-204.
    11. Huicong Yu & Jiaqi Wu & Weiping Zhang, 2024. "Simultaneous subgroup identification and variable selection for high dimensional data," Computational Statistics, Springer, vol. 39(6), pages 3181-3205, September.
    12. Wang, Pei & Yin, Xiangrong & Yuan, Qingcong & Kryscio, Richard, 2021. "Feature filter for estimating central mean subspace and its sparse solution," Computational Statistics & Data Analysis, Elsevier, vol. 163(C).
    13. Ricardo P. Masini & Marcelo C. Medeiros & Eduardo F. Mendes, 2023. "Machine learning advances for time series forecasting," Journal of Economic Surveys, Wiley Blackwell, vol. 37(1), pages 76-111, February.
    14. Min Chen & Yimin Lian & Zhao Chen & Zhengjun Zhang, 2017. "Sure explained variability and independence screening," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 29(4), pages 849-883, October.
    15. Aneiros, Germán & Novo, Silvia & Vieu, Philippe, 2022. "Variable selection in functional regression models: A review," Journal of Multivariate Analysis, Elsevier, vol. 188(C).
    16. Liming Wang & Xingxiang Li & Xiaoqing Wang & Peng Lai, 2022. "Unified mean-variance feature screening for ultrahigh-dimensional regression," Computational Statistics, Springer, vol. 37(4), pages 1887-1918, September.
    17. Bai, Ray & Ghosh, Malay, 2018. "High-dimensional multivariate posterior consistency under global–local shrinkage priors," Journal of Multivariate Analysis, Elsevier, vol. 167(C), pages 157-170.
    18. Wang, Tao & Zhu, Lixing, 2013. "Sparse sufficient dimension reduction using optimal scoring," Computational Statistics & Data Analysis, Elsevier, vol. 57(1), pages 223-232.
    19. Gerda Claeskens, 2012. "Focused estimation and model averaging with penalization methods: an overview," Statistica Neerlandica, Netherlands Society for Statistics and Operations Research, vol. 66(3), pages 272-287, August.
    20. Howard D. Bondell & Brian J. Reich, 2012. "Consistent High-Dimensional Bayesian Variable Selection via Penalized Credible Regions," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(500), pages 1610-1624, December.
