
PCA consistency for the power spiked model in high-dimensional settings

Author

Listed:
  • Yata, Kazuyoshi
  • Aoshima, Makoto

Abstract

In this paper, we propose a general spiked model, called the power spiked model, for high-dimensional settings. We derive relations among the data dimension, the sample size and the high-dimensional noise structure. We first consider the asymptotic properties of the conventional eigenvalue estimator and show that it is directly affected by the high-dimensional noise structure, so that it becomes inconsistent. To overcome these difficulties in high-dimensional situations, we develop new principal component analysis (PCA) methods, called the noise-reduction methodology and the cross-data-matrix methodology, under the power spiked model. We show that the new PCA methods enjoy consistency not only for eigenvalues but also for PC directions and PC scores in high-dimensional settings.
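
The abstract does not spell out the estimators, but the two methodologies it names are commonly described as follows; the sketch below is an illustrative NumPy implementation under those assumed forms, not the paper's exact formulation. The noise-reduction idea subtracts an estimate of the residual noise from each eigenvalue of the n x n dual sample covariance matrix, and the cross-data-matrix idea takes singular values of the cross inner-product matrix between two disjoint halves of the centered sample. The constants, index ranges and regularity conditions used here are assumptions; consult the paper for the precise definitions.

    import numpy as np

    def noise_reduction_eigenvalues(X):
        # Noise-reduction (NR) type eigenvalue estimates (illustrative sketch).
        # X: (n, p) data matrix with rows as observations, typically p >> n.
        # Assumed form: lam_j - (tr(S_D) - sum_{i<=j} lam_i) / (n - 1 - j),
        # where lam_j are eigenvalues of the n x n dual sample covariance S_D.
        n, p = X.shape
        Xc = X - X.mean(axis=0)
        S_dual = Xc @ Xc.T / (n - 1)                 # n x n dual covariance matrix
        lam = np.sort(np.linalg.eigvalsh(S_dual))[::-1]
        trace = lam.sum()
        lam_nr = np.empty(n - 2)
        for j in range(n - 2):                       # 0-based j corresponds to j+1 above
            lam_nr[j] = lam[j] - (trace - lam[:j + 1].sum()) / (n - 2 - j)
        return lam_nr

    def cross_data_matrix_eigenvalues(X):
        # Cross-data-matrix (CDM) type eigenvalue estimates (illustrative sketch).
        # Split the sample into two disjoint halves, center each half, and use the
        # singular values of the cross inner-product matrix between the halves.
        n, p = X.shape
        n1 = n // 2
        n2 = n - n1
        X1 = X[:n1] - X[:n1].mean(axis=0)
        X2 = X[n1:] - X[n1:].mean(axis=0)
        cross = X1 @ X2.T / np.sqrt((n1 - 1) * (n2 - 1))   # n1 x n2 cross data matrix
        return np.linalg.svd(cross, compute_uv=False)

    # Toy check: one strong spike in p = 2000 dimensions with only n = 40 samples.
    rng = np.random.default_rng(0)
    p, n = 2000, 40
    spike = np.sqrt(200.0) * rng.standard_normal((n, 1)) @ rng.standard_normal((1, p)) / np.sqrt(p)
    X = spike + rng.standard_normal((n, p))
    print(noise_reduction_eigenvalues(X)[:3])        # leading estimate roughly of order 200
    print(cross_data_matrix_eigenvalues(X)[:3])

Both functions work only with n x n (or n1 x n2) matrices rather than the p x p sample covariance matrix, which is what makes estimators of this kind attractive when p is much larger than n.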

Suggested Citation

  • Yata, Kazuyoshi & Aoshima, Makoto, 2013. "PCA consistency for the power spiked model in high-dimensional settings," Journal of Multivariate Analysis, Elsevier, vol. 122(C), pages 334-354.
  • Handle: RePEc:eee:jmvana:v:122:y:2013:i:c:p:334-354
    DOI: 10.1016/j.jmva.2013.08.003

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0047259X13001644
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.jmva.2013.08.003?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version of this item that you can access through your library subscription
    ---><---

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Peter Hall & J. S. Marron & Amnon Neeman, 2005. "Geometric representation of high dimension, low sample size data," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(3), pages 427-444, June.
    2. Johnstone, Iain M. & Lu, Arthur Yu, 2009. "On Consistency and Sparsity for Principal Components Analysis in High Dimensions," Journal of the American Statistical Association, American Statistical Association, vol. 104(486), pages 682-693.
    3. Jung, Sungkyu & Sen, Arusharka & Marron, J.S., 2012. "Boundary behavior in High Dimension, Low Sample Size asymptotics of PCA," Journal of Multivariate Analysis, Elsevier, vol. 109(C), pages 190-203.
    4. Baik, Jinho & Silverstein, Jack W., 2006. "Eigenvalues of large sample covariance matrices of spiked population models," Journal of Multivariate Analysis, Elsevier, vol. 97(6), pages 1382-1408, July.
    5. Yata, Kazuyoshi & Aoshima, Makoto, 2012. "Effective PCA for high-dimension, low-sample-size data with noise reduction via geometric representations," Journal of Multivariate Analysis, Elsevier, vol. 105(1), pages 193-215.
    6. Yata, Kazuyoshi & Aoshima, Makoto, 2010. "Effective PCA for high-dimension, low-sample-size data with singular value decomposition of cross data matrix," Journal of Multivariate Analysis, Elsevier, vol. 101(9), pages 2060-2077, October.
    7. Chen, Song Xi & Qin, Yingli, 2010. "A Two Sample Test for High Dimensional Data with Applications to Gene-set Testing," MPRA Paper 59642, University Library of Munich, Germany.
    8. Jeongyoun Ahn & J. S. Marron & Keith M. Muller & Yueh-Yun Chi, 2007. "The high-dimension, low-sample-size geometric representation holds under mild conditions," Biometrika, Biometrika Trust, vol. 94(3), pages 760-766.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Makoto Aoshima & Kazuyoshi Yata, 2019. "Distance-based classifier by data transformation for high-dimension, strongly spiked eigenvalue models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 71(3), pages 473-503, June.
    2. Huang, Shih-Hao & Huang, Su-Yun, 2021. "On the asymptotic normality and efficiency of Kronecker envelope principal component analysis," Journal of Multivariate Analysis, Elsevier, vol. 184(C).
    3. Wang, Shao-Hsuan & Huang, Su-Yun & Chen, Ting-Li, 2020. "On asymptotic normality of cross data matrix-based PCA in high dimension low sample size," Journal of Multivariate Analysis, Elsevier, vol. 175(C).
    4. Okudo, Michiko & Komaki, Fumiyasu, 2021. "Shrinkage priors for single-spiked covariance models," Statistics & Probability Letters, Elsevier, vol. 176(C).
    5. Jonathan Gillard & Emily O’Riordan & Anatoly Zhigljavsky, 2023. "Polynomial whitening for high-dimensional data," Computational Statistics, Springer, vol. 38(3), pages 1427-1461, September.
    6. Bando, Takuma & Sei, Tomonari & Yata, Kazuyoshi, 2022. "Consistency of the objective general index in high-dimensional settings," Journal of Multivariate Analysis, Elsevier, vol. 189(C).
    7. Kazuyoshi Yata & Makoto Aoshima, 2020. "Geometric consistency of principal component scores for high‐dimensional mixture models and its application," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 47(3), pages 899-921, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Makoto Aoshima & Kazuyoshi Yata, 2014. "A distance-based, misclassification rate adjusted classifier for multiclass, high-dimensional data," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 66(5), pages 983-1010, October.
    2. Yata, Kazuyoshi & Aoshima, Makoto, 2013. "Correlation tests for high-dimensional data using extended cross-data-matrix methodology," Journal of Multivariate Analysis, Elsevier, vol. 117(C), pages 313-331.
    3. Wang, Shao-Hsuan & Huang, Su-Yun, 2022. "Perturbation theory for cross data matrix-based PCA," Journal of Multivariate Analysis, Elsevier, vol. 190(C).
    4. Shen, Dan & Shen, Haipeng & Marron, J.S., 2013. "Consistency of sparse PCA in High Dimension, Low Sample Size contexts," Journal of Multivariate Analysis, Elsevier, vol. 115(C), pages 317-333.
    5. Wang, Shao-Hsuan & Huang, Su-Yun & Chen, Ting-Li, 2020. "On asymptotic normality of cross data matrix-based PCA in high dimension low sample size," Journal of Multivariate Analysis, Elsevier, vol. 175(C).
    6. Jung, Sungkyu & Sen, Arusharka & Marron, J.S., 2012. "Boundary behavior in High Dimension, Low Sample Size asymptotics of PCA," Journal of Multivariate Analysis, Elsevier, vol. 109(C), pages 190-203.
    7. Ishii, Aki & Yata, Kazuyoshi & Aoshima, Makoto, 2022. "Geometric classifiers for high-dimensional noisy data," Journal of Multivariate Analysis, Elsevier, vol. 188(C).
    8. Jianqing Fan & Yuan Liao & Martina Mincheva, 2013. "Large covariance estimation by thresholding principal orthogonal complements," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 75(4), pages 603-680, September.
    9. Yata, Kazuyoshi & Aoshima, Makoto, 2012. "Effective PCA for high-dimension, low-sample-size data with noise reduction via geometric representations," Journal of Multivariate Analysis, Elsevier, vol. 105(1), pages 193-215.
    10. Yata, Kazuyoshi & Aoshima, Makoto, 2010. "Effective PCA for high-dimension, low-sample-size data with singular value decomposition of cross data matrix," Journal of Multivariate Analysis, Elsevier, vol. 101(9), pages 2060-2077, October.
    11. Kristoffer H. Hellton & Magne Thoresen, 2017. "When and Why are Principal Component Scores a Good Tool for Visualizing High-dimensional Data?," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 44(3), pages 581-597, September.
    12. Makoto Aoshima & Kazuyoshi Yata, 2019. "Distance-based classifier by data transformation for high-dimension, strongly spiked eigenvalue models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 71(3), pages 473-503, June.
    13. Borysov, Petro & Hannig, Jan & Marron, J.S., 2014. "Asymptotics of hierarchical clustering for growing dimension," Journal of Multivariate Analysis, Elsevier, vol. 124(C), pages 465-479.
    14. Chung, Hee Cheol & Ahn, Jeongyoun, 2021. "Subspace rotations for high-dimensional outlier detection," Journal of Multivariate Analysis, Elsevier, vol. 183(C).
    15. Nakayama, Yugo & Yata, Kazuyoshi & Aoshima, Makoto, 2021. "Clustering by principal component analysis with Gaussian kernel in high-dimension, low-sample-size settings," Journal of Multivariate Analysis, Elsevier, vol. 185(C).
    16. Kazuyoshi Yata & Makoto Aoshima, 2020. "Geometric consistency of principal component scores for high‐dimensional mixture models and its application," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 47(3), pages 899-921, September.
    17. Jung, Sungkyu, 2018. "Continuum directions for supervised dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 125(C), pages 27-43.
    18. Jonathan Gillard & Emily O’Riordan & Anatoly Zhigljavsky, 2023. "Polynomial whitening for high-dimensional data," Computational Statistics, Springer, vol. 38(3), pages 1427-1461, September.
    19. Aki Ishii & Kazuyoshi Yata & Makoto Aoshima, 2021. "Hypothesis tests for high-dimensional covariance structures," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 73(3), pages 599-622, June.
    20. Lee, Myung Hee, 2012. "On the border of extreme and mild spiked models in the HDLSS framework," Journal of Multivariate Analysis, Elsevier, vol. 107(C), pages 162-168.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jmvana:v:122:y:2013:i:c:p:334-354. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/622892/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.