
Sparsistency and rates of convergence in large covariance matrix estimation

Author

Listed:
  • Lam, Clifford
  • Fan, Jianqing

Abstract

This paper studies the sparsistency and rates of convergence for estimating sparse covariance and precision matrices based on penalized likelihood with nonconvex penalty functions. Here, sparsistency refers to the property that all parameters that are zero are actually estimated as zero with probability tending to one. Depending on the application, sparsity may occur a priori in the covariance matrix, its inverse or its Cholesky decomposition. We study these three sparsity exploration problems under a unified framework with a general penalty function. We show that the rates of convergence for these problems under the Frobenius norm are of order (s_n log p_n / n)^{1/2}, where s_n is the number of nonzero elements, p_n is the size of the covariance matrix and n is the sample size. This explicitly spells out that the contribution of high dimensionality is merely a logarithmic factor. The conditions on the rate with which the tuning parameter λ_n goes to 0 have been made explicit and compared under different penalties. As a result, for the L1-penalty, to guarantee sparsistency and the optimal rate of convergence, the number of nonzero elements should be small: s_n' = O(p_n) at most, among O(p_n^2) parameters, for estimating a sparse covariance or correlation matrix, sparse precision or inverse correlation matrix, or sparse Cholesky factor, where s_n' is the number of nonzero elements among the off-diagonal entries. On the other hand, using the SCAD or hard-thresholding penalty functions, there is no such restriction.
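For the precision-matrix problem, the L1-penalized case discussed above coincides with the graphical lasso: minimize tr(S Ω) − log det(Ω) + λ_n Σ_{i≠j} |ω_ij| over positive definite Ω. The sketch below is a minimal illustration of that special case using scikit-learn's GraphicalLasso; it is not the authors' estimator (which covers general nonconvex penalties such as SCAD), and the tridiagonal precision matrix, the constant 0.5 in the tuning parameter, and the numerical zero tolerance are all illustrative assumptions.

    import numpy as np
    from sklearn.covariance import GraphicalLasso

    # Simulate n observations from a p-dimensional Gaussian whose precision
    # matrix Omega is sparse (tridiagonal), then estimate it with the
    # L1-penalized likelihood (graphical lasso).
    rng = np.random.default_rng(0)
    p, n = 30, 400
    Omega = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
    Sigma = np.linalg.inv(Omega)
    X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

    # Tuning parameter lambda_n shrinking like sqrt(log p_n / n), in line with
    # the rate in the abstract; the factor 0.5 is an arbitrary illustrative choice.
    lam = 0.5 * np.sqrt(np.log(p) / n)
    fit = GraphicalLasso(alpha=lam, max_iter=200).fit(X)

    # Sparsistency-style check: how many true zeros of Omega are estimated as
    # (numerically) zero, and how many true nonzeros are retained.
    true_zero = np.isclose(Omega, 0.0)
    est_zero = np.isclose(fit.precision_, 0.0, atol=1e-8)
    print("true zeros set to zero:", int((true_zero & est_zero).sum()), "of", int(true_zero.sum()))
    print("true nonzeros retained:", int((~true_zero & ~est_zero).sum()), "of", int((~true_zero).sum()))
    print("Frobenius error:", np.linalg.norm(fit.precision_ - Omega, "fro"))

According to the paper's result, replacing the constant L1 penalty with a SCAD or hard-thresholding penalty removes the s_n' = O(p_n) restriction on the number of nonzero off-diagonal elements, at the cost of a nonconvex optimization step.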

Suggested Citation

  • Lam, Clifford & Fan, Jianqing, 2009. "Sparsistency and rates of convergence in large covariance matrix estimation," LSE Research Online Documents on Economics 31540, London School of Economics and Political Science, LSE Library.
  • Handle: RePEc:ehl:lserod:31540

    Download full text from publisher

    File URL: http://eprints.lse.ac.uk/31540/
    File Function: Open access version.
    Download Restriction: no


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lam, Clifford, 2008. "Estimation of large precision matrices through block penalization," LSE Research Online Documents on Economics 31543, London School of Economics and Political Science, LSE Library.
    2. Benjamin Poignard & Manabu Asai, 2023. "Estimation of high-dimensional vector autoregression via sparse precision matrix," The Econometrics Journal, Royal Economic Society, vol. 26(2), pages 307-326.
    3. Gautam Sabnis & Debdeep Pati & Anirban Bhattacharya, 2019. "Compressed Covariance Estimation with Automated Dimension Learning," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 81(2), pages 466-481, December.
    4. Sung, Bongjung & Lee, Jaeyong, 2023. "Covariance structure estimation with Laplace approximation," Journal of Multivariate Analysis, Elsevier, vol. 198(C).
    5. John Stephen Yap & Jianqing Fan & Rongling Wu, 2009. "Nonparametric Modeling of Longitudinal Covariance Structure in Functional Mapping of Quantitative Trait Loci," Biometrics, The International Biometric Society, vol. 65(4), pages 1068-1077, December.
    6. Peter Bickel & Bo Li & Alexandre Tsybakov & Sara Geer & Bin Yu & Teófilo Valdés & Carlos Rivero & Jianqing Fan & Aad Vaart, 2006. "Regularization in statistics," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 15(2), pages 271-344, September.
    7. Giraud Christophe & Huet Sylvie & Verzelen Nicolas, 2012. "Graph Selection with GGMselect," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 11(3), pages 1-52, February.
    8. Xi Luo, 2011. "Recovering Model Structures from Large Low Rank and Sparse Covariance Matrix Estimation," Papers 1111.1133, arXiv.org, revised Mar 2013.
    9. Lam, Clifford, 2020. "High-dimensional covariance matrix estimation," LSE Research Online Documents on Economics 101667, London School of Economics and Political Science, LSE Library.
    10. Wei Lan & Ronghua Luo & Chih-Ling Tsai & Hansheng Wang & Yunhong Yang, 2015. "Testing the Diagonality of a Large Covariance Matrix in a Regression Setting," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 33(1), pages 76-86, January.
    11. Wang, Luheng & Chen, Zhao & Wang, Christina Dan & Li, Runze, 2020. "Ultrahigh dimensional precision matrix estimation via refitted cross validation," Journal of Econometrics, Elsevier, vol. 215(1), pages 118-130.
    12. Paolo Giordani & Xiuyan Mun & Robert Kohn, 2012. "Efficient Estimation of Covariance Matrices using Posterior Mode Multiple Shrinkage," Journal of Financial Econometrics, Oxford University Press, vol. 11(1), pages 154-192, December.
    13. Xiaoping Zhou & Dmitry Malioutov & Frank J. Fabozzi & Svetlozar T. Rachev, 2014. "Smooth monotone covariance for elliptical distributions and applications in finance," Quantitative Finance, Taylor & Francis Journals, vol. 14(9), pages 1555-1571, September.
    14. Bailey, Natalia & Pesaran, M. Hashem & Smith, L. Vanessa, 2019. "A multiple testing approach to the regularisation of large sample correlation matrices," Journal of Econometrics, Elsevier, vol. 208(2), pages 507-534.
    15. Lee, Wonyul & Liu, Yufeng, 2012. "Simultaneous multiple response regression and inverse covariance matrix estimation via penalized Gaussian maximum likelihood," Journal of Multivariate Analysis, Elsevier, vol. 111(C), pages 241-255.
    16. Daye, Z. John & Jeng, X. Jessie, 2009. "Shrinkage and model selection with correlated variables via weighted fusion," Computational Statistics & Data Analysis, Elsevier, vol. 53(4), pages 1284-1298, February.
    17. Jianqing Fan & Yuan Liao & Han Liu, 2016. "An overview of the estimation of large covariance and precision matrices," Econometrics Journal, Royal Economic Society, vol. 19(1), pages 1-32, February.
    18. Pesaran, M. Hashem & Yamagata, Takashi, 2012. "Testing CAPM with a Large Number of Assets," IZA Discussion Papers 6469, Institute of Labor Economics (IZA).
    19. Dimitris Korobilis, 2013. "Var Forecasting Using Bayesian Variable Selection," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 28(2), pages 204-230, March.
    20. Lichun Wang & Yuan You & Heng Lian, 2015. "Convergence and sparsity of Lasso and group Lasso in high-dimensional generalized linear models," Statistical Papers, Springer, vol. 56(3), pages 819-828, August.

    More about this item

    Keywords

    Covariance matrix; high dimensionality; consistency; nonconcave penalized likelihood; sparsistency; asymptotic normality;

    JEL classification:

    • C1 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ehl:lserod:31540. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: LSERO Manager (email available below). General contact details of provider: https://edirc.repec.org/data/lsepsuk.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.