
Asymptotic and bootstrap tests for subspace dimension

Author

Listed:
  • Nordhausen, Klaus
  • Oja, Hannu
  • Tyler, David E.

Abstract

Many linear dimension reduction methods proposed in the literature can be formulated using an appropriate pair of scatter matrices. The eigen-decomposition of one scatter matrix with respect to the other is then often used to determine the dimension of the signal subspace and to separate the signal and noise parts of the data. Three popular dimension reduction methods, namely principal component analysis (PCA), fourth-order blind identification (FOBI), and sliced inverse regression (SIR), are considered in detail, and the first two moments of subsets of the eigenvalues are used to test for the dimension of the signal space. The limiting null distributions of the test statistics are discussed, and novel bootstrap strategies are suggested for small-sample cases. In all three cases, consistent test-based estimates of the signal subspace dimension are introduced as well. The asymptotic and bootstrap tests are illustrated in real data examples.
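The scatter-pair idea in the abstract can be sketched in a few lines: compute two scatter matrices, take the eigenvalues of one with respect to the other, and summarize the spread of the eigenvalues attributed to the noise part. The sketch below is a hedged illustration (not the authors' implementation; their tests are distributed as R software in the ICS ecosystem cited among the references) using the sample covariance and a FOBI-type fourth-moment scatter, for which the relative eigenvalues are all near one under multivariate normality. The function names and the "largest-k-eigenvalues-are-signal" ordering are illustrative assumptions.

```python
import numpy as np

def fobi_scatter_pair(X):
    """Return the sample covariance S1 and a FOBI-type fourth-moment
    scatter S2. Under multivariate normality the eigenvalues of
    S1^{-1} S2 are all close to one."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S1 = Xc.T @ Xc / (n - 1)
    # Squared Mahalanobis norms via the Cholesky factor of S1
    L = np.linalg.cholesky(S1)
    Z = np.linalg.solve(L, Xc.T).T          # whitened observations
    r2 = np.sum(Z ** 2, axis=1)             # squared Mahalanobis distances
    # Reweight observations by r2; the 1/(p+2) factor normalizes so that
    # S2 equals S1 in expectation under normality
    S2 = (Xc * r2[:, None]).T @ Xc / (n * (p + 2))
    return S1, S2

def tail_eigenvalue_statistic(S1, S2, k):
    """Variance of the p-k smallest eigenvalues of S2 relative to S1,
    assuming (as in PCA-style ordering) the k signal eigenvalues are the
    largest. Small values are consistent with a k-dimensional signal."""
    evals = np.sort(np.linalg.eigvals(np.linalg.solve(S1, S2)).real)[::-1]
    return float(np.var(evals[k:]))
```

In practice such a statistic would be calibrated against its limiting null distribution or by bootstrap resampling that respects the hypothesized noise structure, which is the subject of the article; the snippet only shows how a scatter pair turns subspace dimension into a statement about eigenvalue equality.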

Suggested Citation

  • Nordhausen, Klaus & Oja, Hannu & Tyler, David E., 2022. "Asymptotic and bootstrap tests for subspace dimension," Journal of Multivariate Analysis, Elsevier, vol. 188(C).
  • Handle: RePEc:eee:jmvana:v:188:y:2022:i:c:s0047259x21001081
    DOI: 10.1016/j.jmva.2021.104830

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0047259X21001081
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.jmva.2021.104830?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version accessible through your library subscription

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Annaliisa Kankainen & Sara Taskinen & Hannu Oja, 2007. "Tests of multinormality based on location vectors and scatter matrices," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 16(3), pages 357-379, November.
    2. Dray, Stephane, 2008. "On the number of principal components: A test of dimensionality based on measurements of similarity between matrices," Computational Statistics & Data Analysis, Elsevier, vol. 52(4), pages 2228-2237, January.
    3. David E. Tyler & Frank Critchley & Lutz Dümbgen & Hannu Oja, 2009. "Invariant co‐ordinate selection," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 71(3), pages 549-592, June.
    4. Zhu, Lixing & Miao, Baiqi & Peng, Heng, 2006. "On Sliced Inverse Regression With High-Dimensional Covariates," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 630-643, June.
    5. Bura, E. & Yang, J., 2011. "Dimension estimation in sufficient dimension reduction: A unifying approach," Journal of Multivariate Analysis, Elsevier, vol. 102(1), pages 130-142, January.
    6. Thomas P. Hettmansperger, 2002. "A practical affine equivariant multivariate median," Biometrika, Biometrika Trust, vol. 89(4), pages 851-860, December.
    7. Nordhausen, Klaus & Oja, Hannu & Tyler, David E., 2008. "Tools for Exploring Multivariate Data: The Package ICS," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 28(i06).
    8. Liping Zhu & Tao Wang & Lixing Zhu & Louis Ferré, 2010. "Sufficient dimension reduction through discretization-expectation estimation," Biometrika, Biometrika Trust, vol. 97(2), pages 295-304.
    9. Ye Z. & Weiss R.E., 2003. "Using the Bootstrap to Select One of a New Class of Dimension Reduction Methods," Journal of the American Statistical Association, American Statistical Association, vol. 98, pages 968-979, January.
    10. Eaton, M. L. & Tyler, D., 1994. "The Asymptotic Distribution of Singular-Values with Applications to Canonical Correlations and Correspondence Analysis," Journal of Multivariate Analysis, Elsevier, vol. 50(2), pages 238-264, August.
    11. Klaus Nordhausen & David E. Tyler, 2015. "A cautionary note on robust covariance plug-in methods," Biometrika, Biometrika Trust, vol. 102(3), pages 573-588.
    12. Weisberg, Sanford, 2002. "Dimension Reduction Regression in R," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 7(i01).
    13. Schott, James R., 2006. "A high-dimensional test for the equality of the smallest eigenvalues of a covariance matrix," Journal of Multivariate Analysis, Elsevier, vol. 97(4), pages 827-843, April.
    14. Wei Luo & Bing Li, 2016. "Combining eigenvalues and variation of eigenvectors for order determination," Biometrika, Biometrika Trust, vol. 103(4), pages 875-887.
    15. Yanyuan Ma & Liping Zhu, 2013. "A Review on Dimension Reduction," International Statistical Review, International Statistical Institute, vol. 81(1), pages 134-150, April.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Bernard, Gaspard & Verdebout, Thomas, 2024. "On testing the equality of latent roots of scatter matrices under ellipticity," Journal of Multivariate Analysis, Elsevier, vol. 199(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Nordhausen, Klaus & Ruiz-Gazen, Anne, 2022. "On the usage of joint diagonalization in multivariate statistics," Journal of Multivariate Analysis, Elsevier, vol. 188(C).
    2. Joni Virta & Niko Lietzén & Henri Nyberg, 2024. "Robust signal dimension estimation via SURE," Statistical Papers, Springer, vol. 65(5), pages 3007-3038, July.
    3. Shih‐Hao Huang & Kerby Shedden & Hsin‐wen Chang, 2023. "Inference for the dimension of a regression relationship using pseudo‐covariates," Biometrics, The International Biometric Society, vol. 79(3), pages 2394-2403, September.
    4. Wang, Qin & Xue, Yuan, 2021. "An ensemble of inverse moment estimators for sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 161(C).
    5. Pircalabelu, Eugen & Artemiou, Andreas, 2021. "Graph informed sliced inverse regression," Computational Statistics & Data Analysis, Elsevier, vol. 164(C).
    6. Chen, Canyi & Xu, Wangli & Zhu, Liping, 2022. "Distributed estimation in heterogeneous reduced rank regression: With application to order determination in sufficient dimension reduction," Journal of Multivariate Analysis, Elsevier, vol. 190(C).
    7. Kim, Kyongwon, 2022. "On principal graphical models with application to gene network," Computational Statistics & Data Analysis, Elsevier, vol. 166(C).
    8. Archimbaud, Aurore & Nordhausen, Klaus & Ruiz-Gazen, Anne, 2018. "ICS for multivariate outlier detection with application to quality control," Computational Statistics & Data Analysis, Elsevier, vol. 128(C), pages 184-199.
    9. Wei Luo, 2022. "On efficient dimension reduction with respect to the interaction between two response variables," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(2), pages 269-294, April.
    10. Zhu, Xuehu & Guo, Xu & Wang, Tao & Zhu, Lixing, 2020. "Dimensionality determination: A thresholding double ridge ratio approach," Computational Statistics & Data Analysis, Elsevier, vol. 146(C).
    11. Bernard, Gaspard & Verdebout, Thomas, 2024. "On testing the equality of latent roots of scatter matrices under ellipticity," Journal of Multivariate Analysis, Elsevier, vol. 199(C).
    12. Zhou, Jingke & Xu, Wangli & Zhu, Lixing, 2015. "Robust estimating equation-based sufficient dimension reduction," Journal of Multivariate Analysis, Elsevier, vol. 134(C), pages 99-118.
    13. Zeng, Yicheng & Zhu, Lixing, 2023. "Order determination for spiked-type models with a divergent number of spikes," Computational Statistics & Data Analysis, Elsevier, vol. 182(C).
    14. Xie, Chuanlong & Zhu, Lixing, 2020. "Generalized kernel-based inverse regression methods for sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 150(C).
    15. Kapla, Daniel & Fertl, Lukas & Bura, Efstathia, 2022. "Fusing sufficient dimension reduction with neural networks," Computational Statistics & Data Analysis, Elsevier, vol. 168(C).
    16. Lu Li & Kai Tan & Xuerong Meggie Wen & Zhou Yu, 2023. "Variable-dependent partial dimension reduction," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 32(2), pages 521-541, June.
    17. Dümbgen, Lutz & Nordhausen, Klaus & Schuhmacher, Heike, 2016. "New algorithms for M-estimation of multivariate scatter and location," Journal of Multivariate Analysis, Elsevier, vol. 144(C), pages 200-217.
    18. Becquart, Colombe & Archimbaud, Aurore & Ruiz-Gazen, Anne & Prilé, Luka & Nordhausen, Klaus, 2024. "Invariant Coordinate Selection and Fisher Discriminant Subspace Beyond The Case of Two Groups," TSE Working Papers 24-1579, Toulouse School of Economics (TSE).
    19. Qin Wang & Yuan Xue, 2023. "A structured covariance ensemble for sufficient dimension reduction," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 17(3), pages 777-800, September.
    20. Hung Hung & Su‐Yun Huang, 2019. "Sufficient dimension reduction via random‐partitions for the large‐p‐small‐n problem," Biometrics, The International Biometric Society, vol. 75(1), pages 245-255, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jmvana:v:188:y:2022:i:c:s0047259x21001081. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/622892/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.