
Sufficient dimension reduction via random‐partitions for the large‐p‐small‐n problem

Authors

  • Hung Hung
  • Su‐Yun Huang

Abstract

Sufficient dimension reduction (SDR) continues to be an active field of research. When estimating the central subspace (CS), inverse-regression-based SDR methods involve solving a generalized eigenvalue problem, which can be problematic under the large-p-small-n situation. In recent years, new techniques have emerged in numerical linear algebra, called randomized algorithms or random sketching, for high-dimensional and large-scale problems. To overcome the large-p-small-n SDR problem, we combine the idea of statistical inference with random sketching to propose a new SDR method, called integrated random-partition SDR (iRP-SDR). Our method consists of the following three steps: (i) Randomly partition the covariates into subsets to construct an envelope subspace with low dimension. (ii) Obtain a sketch of the CS by applying a conventional SDR method within the constructed envelope subspace. (iii) Repeat the above two steps many times and integrate these multiple sketches to form the final estimate of the CS. After describing the details of these steps, the asymptotic properties of iRP-SDR are established. Unlike existing methods, iRP-SDR does not involve determining the structural dimension until the last stage, which makes it more adaptive to a high-dimensional setting. The advantageous performance of iRP-SDR is demonstrated via simulation studies and a practical example analyzing EEG data.
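As a reading aid, the following is a minimal Python/NumPy sketch of the three-step procedure described in the abstract. It is not the authors' implementation: it assumes sliced inverse regression (SIR) as the "conventional SDR method" in step (ii), and it integrates the repeated sketches of step (iii) by averaging their projection matrices and extracting leading eigenvectors. All function names, parameter names, and default values (sir_directions, irp_sdr, n_partitions, n_subsets, work_dim, n_slices) are illustrative choices, not quantities from the paper.

```python
import numpy as np


def sir_directions(X, y, n_slices=5, n_dir=2):
    """Classical SIR: leading directions of Cov(E[X | slice of y]),
    solved as a generalized eigenvalue problem against Cov(X)."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(p)  # small ridge for stability
    # slice the response and collect slice means of the centred covariates
    slices = np.array_split(np.argsort(y), n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Xc[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # whiten, then solve the symmetric eigenproblem cov^{-1/2} M cov^{-1/2}
    w, V = np.linalg.eigh(cov)
    inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    evals, evecs = np.linalg.eigh(inv_sqrt @ M @ inv_sqrt)
    top = evecs[:, np.argsort(evals)[::-1][:n_dir]]
    return inv_sqrt @ top  # p x n_dir directions in the original coordinates


def irp_sdr(X, y, final_dim=2, n_partitions=100, n_subsets=5,
            dirs_per_subset=1, work_dim=3, n_slices=5, seed=None):
    """Random-partition SDR sketch: envelope construction, within-envelope
    SDR, and integration of the repeated sketches."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    P_sum = np.zeros((p, p))
    for _ in range(n_partitions):
        # (i) randomly partition the covariates into subsets; a few SIR
        #     directions per subset span a low-dimensional envelope subspace
        #     (each subset should be small relative to n for SIR to be stable)
        blocks = np.array_split(rng.permutation(p), n_subsets)
        cols = []
        for block in blocks:
            b = sir_directions(X[:, block], y, n_slices, dirs_per_subset)
            B = np.zeros((p, b.shape[1]))
            B[block, :] = b
            cols.append(B)
        E, _ = np.linalg.qr(np.hstack(cols))        # orthonormal envelope basis
        # (ii) a sketch of the CS: conventional SDR inside the envelope,
        #      using a working dimension rather than the final one
        d = sir_directions(X @ E, y, n_slices, work_dim)
        Q, _ = np.linalg.qr(E @ d)                  # back to R^p, orthonormalized
        P_sum += Q @ Q.T
    # (iii) integrate: the structural dimension enters only here, as the
    #       number of leading eigenvectors of the averaged projections
    evals, evecs = np.linalg.eigh(P_sum / n_partitions)
    return evecs[:, np.argsort(evals)[::-1][:final_dim]]
```

With X of shape (n, p) and a scalar response y, irp_sdr(X, y, final_dim=2) returns a p × 2 orthonormal basis whose span estimates the CS; note that final_dim enters only at the last integration step, echoing the abstract's point that the structural dimension need not be fixed in advance.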

Suggested Citation

  • Hung Hung & Su‐Yun Huang, 2019. "Sufficient dimension reduction via random‐partitions for the large‐p‐small‐n problem," Biometrics, The International Biometric Society, vol. 75(1), pages 245-255, March.
  • Handle: RePEc:bla:biomet:v:75:y:2019:i:1:p:245-255
    DOI: 10.1111/biom.12926

    Download full text from publisher

    File URL: https://doi.org/10.1111/biom.12926
    Download Restriction: no


    References listed on IDEAS

    1. Lexin Li, 2007. "Sparse sufficient dimension reduction," Biometrika, Biometrika Trust, vol. 94(3), pages 603-613.
    2. Lexin Li & R. Dennis Cook & Chih-Ling Tsai, 2007. "Partial inverse regression," Biometrika, Biometrika Trust, vol. 94(3), pages 615-625.
3. Bing Li & Shaoli Wang, 2007. "On Directional Regression for Dimension Reduction," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 997-1008, September.
    4. Jianqing Fan & Jinchi Lv, 2008. "Sure independence screening for ultrahigh dimensional feature space," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 70(5), pages 849-911, November.
    5. Runze Li & Wei Zhong & Liping Zhu, 2012. "Feature Screening via Distance Correlation Learning," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(499), pages 1129-1139, September.
    6. Xiangrong Yin & Haileab Hilafu, 2015. "Sequential sufficient dimension reduction for large p, small n problems," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 77(4), pages 879-892, September.
    7. R. Dennis Cook & Bing Li & Francesca Chiaromonte, 2007. "Dimension reduction in regression without matrix inversion," Biometrika, Biometrika Trust, vol. 94(3), pages 569-584.
    8. Hua Zhou & Lexin Li, 2014. "Regularized matrix regression," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 76(2), pages 463-483, March.
    9. Liping Zhu & Tao Wang & Lixing Zhu & Louis Ferré, 2010. "Sufficient dimension reduction through discretization-expectation estimation," Biometrika, Biometrika Trust, vol. 97(2), pages 295-304.
    10. Yanyuan Ma & Liping Zhu, 2013. "A Review on Dimension Reduction," International Statistical Review, International Statistical Institute, vol. 81(1), pages 134-150, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Wang, Pei & Yin, Xiangrong & Yuan, Qingcong & Kryscio, Richard, 2021. "Feature filter for estimating central mean subspace and its sparse solution," Computational Statistics & Data Analysis, Elsevier, vol. 163(C).
    2. Wang, Qin & Xue, Yuan, 2021. "An ensemble of inverse moment estimators for sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 161(C).
3. Baek, Seungchul & Park, Hoyoung & Park, Junyong, 2024. "Variable selection using data splitting and projection for principal fitted component models in high dimension," Computational Statistics & Data Analysis, Elsevier, vol. 196(C).
    4. Hojin Yang & Hongtu Zhu & Joseph G. Ibrahim, 2018. "MILFM: Multiple index latent factor model based on high‐dimensional features," Biometrics, The International Biometric Society, vol. 74(3), pages 834-844, September.
    5. Zhenghui Feng & Lu Lin & Ruoqing Zhu & Lixing Zhu, 2020. "Nonparametric variable selection and its application to additive models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 72(3), pages 827-854, June.
    6. Zifang Guo & Lexin Li & Wenbin Lu & Bing Li, 2015. "Groupwise Dimension Reduction via Envelope Method," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(512), pages 1515-1527, December.
    7. Zhou Yu & Yuexiao Dong & Li-Xing Zhu, 2016. "Trace Pursuit: A General Framework for Model-Free Variable Selection," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(514), pages 813-821, April.
    8. Qin Wang & Yuan Xue, 2023. "A structured covariance ensemble for sufficient dimension reduction," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 17(3), pages 777-800, September.
    9. Yuan, Qingcong & Chen, Xianyan & Ke, Chenlu & Yin, Xiangrong, 2022. "Independence index sufficient variable screening for categorical responses," Computational Statistics & Data Analysis, Elsevier, vol. 174(C).
    10. Weng, Jiaying, 2022. "Fourier transform sparse inverse regression estimators for sufficient variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 168(C).
    11. Yang, Baoying & Yin, Xiangrong & Zhang, Nan, 2019. "Sufficient variable selection using independence measures for continuous response," Journal of Multivariate Analysis, Elsevier, vol. 173(C), pages 480-493.
    12. Ke, Chenlu & Yang, Wei & Yuan, Qingcong & Li, Lu, 2023. "Partial sufficient variable screening with categorical controls," Computational Statistics & Data Analysis, Elsevier, vol. 187(C).
    13. Barbarino, Alessandro & Bura, Efstathia, 2024. "Forecasting Near-equivalence of Linear Dimension Reduction Methods in Large Panels of Macro-variables," Econometrics and Statistics, Elsevier, vol. 31(C), pages 1-18.
    14. Fang, Fang & Yu, Zhou, 2020. "Model averaging assisted sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 152(C).
    15. Feng, Zheng-Hui & Lin, Lu & Zhu, Ruo-Qing & Zhu, Li-Xing, 2018. "Nonparametric Variable Selection and Its Application to Additive Models," IRTG 1792 Discussion Papers 2018-002, Humboldt University of Berlin, International Research Training Group 1792 "High Dimensional Nonstationary Time Series".
    16. Pircalabelu, Eugen & Artemiou, Andreas, 2021. "Graph informed sliced inverse regression," Computational Statistics & Data Analysis, Elsevier, vol. 164(C).
    17. Wang, Tao & Zhu, Lixing, 2013. "Sparse sufficient dimension reduction using optimal scoring," Computational Statistics & Data Analysis, Elsevier, vol. 57(1), pages 223-232.
    18. Emmanuel Jordy Menvouta & Sven Serneels & Tim Verdonck, 2022. "Sparse dimension reduction based on energy and ball statistics," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 16(4), pages 951-975, December.
    19. Zhang, Hong-Fan, 2021. "Minimum Average Variance Estimation with group Lasso for the multivariate response Central Mean Subspace," Journal of Multivariate Analysis, Elsevier, vol. 184(C).
    20. Lin, Lu & Sun, Jing & Zhu, Lixing, 2013. "Nonparametric feature screening," Computational Statistics & Data Analysis, Elsevier, vol. 67(C), pages 162-174.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.