Printed from https://ideas.repec.org/a/eee/csdana/v164y2021ics0167947321001365.html

Graph informed sliced inverse regression

Author

Listed:
  • Pircalabelu, Eugen
  • Artemiou, Andreas

Abstract

A new method is developed for performing sufficient dimension reduction when probabilistic graphical models are used to estimate parameters. The procedure extends the domain of application of dimension reduction techniques to settings where (i) the number of variables p in the model is much larger than the available sample size n, (ii) p is much larger than the number of slices H used by the model, and (iii) the number of projection vectors D can be larger than the number of slices H. The methodology is developed for the sliced inverse regression model, but extensions to other dimension reduction techniques, such as sliced average variance estimation, are straightforward.
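For background, the graph-informed estimator described above builds on classical sliced inverse regression (SIR). The following is a minimal sketch of plain SIR only, not the authors' graph-informed procedure; the function name, toy model, and parameters are illustrative:

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_directions=2):
    """Classical sliced inverse regression (Li, 1991).

    Slices the response, averages the standardized predictors within each
    slice, and eigen-decomposes the between-slice covariance to estimate
    directions of the dimension reduction subspace.
    """
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Slice the response into roughly equal-sized groups by order statistics
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the within-slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Top eigenvectors of M, mapped back to the original predictor scale
    w, v = np.linalg.eigh(M)  # eigenvalues in ascending order
    B = inv_sqrt @ v[:, ::-1][:, :n_directions]
    return B / np.linalg.norm(B, axis=0)

# Toy single-index model: y depends on X only through X @ beta
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
beta = np.array([1.0, -1.0, 0.0, 0.0, 0.0, 0.0]) / np.sqrt(2)
y = (X @ beta) ** 3 + 0.1 * rng.normal(size=500)
B = sir_directions(X, y, n_slices=10, n_directions=1)
print(abs(float(B[:, 0] @ beta)))  # close to 1: the direction is recovered
```

Note that this classical estimator requires inverting the predictor covariance, which fails when p exceeds n; the regime (i) above is precisely where the paper's graphical-model-based approach is needed instead.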

Suggested Citation

  • Pircalabelu, Eugen & Artemiou, Andreas, 2021. "Graph informed sliced inverse regression," Computational Statistics & Data Analysis, Elsevier, vol. 164(C).
  • Handle: RePEc:eee:csdana:v:164:y:2021:i:c:s0167947321001365
    DOI: 10.1016/j.csda.2021.107302

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947321001365
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2021.107302?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lexin Li, 2007. "Sparse sufficient dimension reduction," Biometrika, Biometrika Trust, vol. 94(3), pages 603-613.
    2. Qian Lin & Zhigen Zhao & Jun S. Liu, 2019. "Sparse Sliced Inverse Regression via Lasso," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 114(528), pages 1726-1739, October.
    3. Jian Guo & Elizaveta Levina & George Michailidis & Ji Zhu, 2011. "Joint estimation of multiple graphical models," Biometrika, Biometrika Trust, vol. 98(1), pages 1-15.
    4. Yingcun Xia & Howell Tong & W. K. Li & Li‐Xing Zhu, 2002. "An adaptive estimation of dimension reduction space," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 64(3), pages 363-410, August.
    5. Shin, Seung Jun & Artemiou, Andreas, 2017. "Penalized principal logistic regression for sparse sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 111(C), pages 48-58.
    6. Radchenko, Peter, 2015. "High dimensional single index models," Journal of Multivariate Analysis, Elsevier, vol. 139(C), pages 266-282.
    7. Zhu, Lixing & Miao, Baiqi & Peng, Heng, 2006. "On Sliced Inverse Regression With High-Dimensional Covariates," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 630-643, June.
    8. Bura, E. & Yang, J., 2011. "Dimension estimation in sufficient dimension reduction: A unifying approach," Journal of Multivariate Analysis, Elsevier, vol. 102(1), pages 130-142, January.
    9. Pircalabelu, Eugen & Claeskens, Gerda & Waldorp, Lourens J., 2016. "Mixed scale joint graphical lasso," LIDAM Reprints ISBA 2016049, Université catholique de Louvain, Institute of Statistics, Biostatistics and Actuarial Sciences (ISBA).
    10. Li, Bing & Wang, Shaoli, 2007. "On Directional Regression for Dimension Reduction," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 997-1008, September.
    11. Lam, Clifford & Fan, Jianqing, 2009. "Sparsistency and rates of convergence in large covariance matrix estimation," LSE Research Online Documents on Economics 31540, London School of Economics and Political Science, LSE Library.
    12. Patrick Danaher & Pei Wang & Daniela M. Witten, 2014. "The joint graphical lasso for inverse covariance estimation across multiple classes," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 76(2), pages 373-397, March.
    13. Zhu, Li-Ping & Zhu, Li-Xing, 2009. "Nonconcave penalized inverse regression in single-index models with high dimensional predictors," Journal of Multivariate Analysis, Elsevier, vol. 100(5), pages 862-875, May.
    14. Wang, Qin & Yin, Xiangrong, 2008. "A nonlinear multi-dimensional variable selection method for high dimensional data: Sparse MAVE," Computational Statistics & Data Analysis, Elsevier, vol. 52(9), pages 4512-4520, May.
    15. Aaron J Molstad & Adam J Rothman, 2018. "Shrinking characteristics of precision matrix estimators," Biometrika, Biometrika Trust, vol. 105(3), pages 563-574.
    16. Teng Zhang & Hui Zou, 2014. "Sparse precision matrix estimation via lasso penalized D-trace loss," Biometrika, Biometrika Trust, vol. 101(1), pages 103-120.
    17. Zhu, Li-Xing & Ohtaki, Megu & Li, Yingxing, 2007. "On hybrid methods of inverse regression-based algorithms," Computational Statistics & Data Analysis, Elsevier, vol. 51(5), pages 2621-2635, February.
    18. Seung Jun Shin & Yichao Wu & Hao Helen Zhang & Yufeng Liu, 2017. "Principal weighted support vector machines for sufficient dimension reduction in binary classification," Biometrika, Biometrika Trust, vol. 104(1), pages 67-81.
    19. Luigi Augugliaro & Angelo M. Mineo & Ernst C. Wit, 2013. "Differential geometric least angle regression: a differential geometric approach to sparse generalized linear models," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 75(3), pages 471-498, June.
    20. Wei Luo & Bing Li, 2016. "Combining eigenvalues and variation of eigenvectors for order determination," Biometrika, Biometrika Trust, vol. 103(4), pages 875-887.
    21. Yanyuan Ma & Liping Zhu, 2013. "A Review on Dimension Reduction," International Statistical Review, International Statistical Institute, vol. 81(1), pages 134-150, April.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Wei Luo, 2022. "On efficient dimension reduction with respect to the interaction between two response variables," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(2), pages 269-294, April.
    2. Wang, Pei & Yin, Xiangrong & Yuan, Qingcong & Kryscio, Richard, 2021. "Feature filter for estimating central mean subspace and its sparse solution," Computational Statistics & Data Analysis, Elsevier, vol. 163(C).
    3. Wang, Qin & Xue, Yuan, 2021. "An ensemble of inverse moment estimators for sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 161(C).
    4. Fang, Fang & Yu, Zhou, 2020. "Model averaging assisted sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 152(C).
    5. Zhang, Hong-Fan, 2021. "Minimum Average Variance Estimation with group Lasso for the multivariate response Central Mean Subspace," Journal of Multivariate Analysis, Elsevier, vol. 184(C).
    6. Zifang Guo & Lexin Li & Wenbin Lu & Bing Li, 2015. "Groupwise Dimension Reduction via Envelope Method," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(512), pages 1515-1527, December.
    7. Chen, Canyi & Xu, Wangli & Zhu, Liping, 2022. "Distributed estimation in heterogeneous reduced rank regression: With application to order determination in sufficient dimension reduction," Journal of Multivariate Analysis, Elsevier, vol. 190(C).
    8. Weng, Jiaying, 2022. "Fourier transform sparse inverse regression estimators for sufficient variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 168(C).
    9. Shih‐Hao Huang & Kerby Shedden & Hsin‐wen Chang, 2023. "Inference for the dimension of a regression relationship using pseudo‐covariates," Biometrics, The International Biometric Society, vol. 79(3), pages 2394-2403, September.
    10. Qin Wang & Yuan Xue, 2023. "A structured covariance ensemble for sufficient dimension reduction," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 17(3), pages 777-800, September.
    11. Hayley Randall & Andreas Artemiou & Xingye Qiao, 2021. "Sufficient dimension reduction based on distance‐weighted discrimination," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 48(4), pages 1186-1211, December.
    12. Zhang, Qingzhao & Ma, Shuangge & Huang, Yuan, 2021. "Promote sign consistency in the joint estimation of precision matrices," Computational Statistics & Data Analysis, Elsevier, vol. 159(C).
    13. Nordhausen, Klaus & Oja, Hannu & Tyler, David E., 2022. "Asymptotic and bootstrap tests for subspace dimension," Journal of Multivariate Analysis, Elsevier, vol. 188(C).
    14. Jang, Hyun Jung & Shin, Seung Jun & Artemiou, Andreas, 2023. "Principal weighted least square support vector machine: An online dimension-reduction tool for binary classification," Computational Statistics & Data Analysis, Elsevier, vol. 187(C).
    15. Xie, Chuanlong & Zhu, Lixing, 2020. "Generalized kernel-based inverse regression methods for sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 150(C).
    16. Kim, Kyongwon, 2022. "On principal graphical models with application to gene network," Computational Statistics & Data Analysis, Elsevier, vol. 166(C).
    17. Kapla, Daniel & Fertl, Lukas & Bura, Efstathia, 2022. "Fusing sufficient dimension reduction with neural networks," Computational Statistics & Data Analysis, Elsevier, vol. 168(C).
    18. Baek, Seungchul & Hoyoung, Park & Park, Junyong, 2024. "Variable selection using data splitting and projection for principal fitted component models in high dimension," Computational Statistics & Data Analysis, Elsevier, vol. 196(C).
    19. Lu Li & Kai Tan & Xuerong Meggie Wen & Zhou Yu, 2023. "Variable-dependent partial dimension reduction," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 32(2), pages 521-541, June.
    20. Nordhausen, Klaus & Ruiz-Gazen, Anne, 2022. "On the usage of joint diagonalization in multivariate statistics," Journal of Multivariate Analysis, Elsevier, vol. 188(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:164:y:2021:i:c:s0167947321001365. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.