
The beta-mixture shrinkage prior for sparse covariances with near-minimax posterior convergence rate

Author

Listed:
  • Lee, Kyoungjae
  • Jo, Seongil
  • Lee, Jaeyong

Abstract

Statistical inference for sparse covariance matrices is crucial for revealing the dependence structure of large multivariate data sets, but scalable and theoretically supported Bayesian methods for this task are lacking. In this paper, we propose a beta-mixture shrinkage prior for sparse covariance matrices that is computationally more efficient than the spike-and-slab prior, and we establish its minimax optimality in high-dimensional settings. The proposed prior places independent beta-mixture shrinkage priors on the off-diagonal entries and gamma priors on the diagonal entries. To ensure positive definiteness of the covariance matrix, we further restrict the support of the prior to the subspace of positive definite matrices. We derive the convergence rate of the induced posterior under the Frobenius norm and establish a minimax lower bound for sparse covariance matrices. The class of sparse covariance matrices used for the minimax lower bound is controlled by the number of nonzero off-diagonal elements and has more intuitive appeal than those that have appeared in the literature. We show that the posterior convergence rates of the proposed methods are minimax or nearly minimax. A simulation study shows that the proposed method is computationally more efficient than its competitors while achieving comparable performance. The advantages of the beta-mixture shrinkage prior are demonstrated on two real data sets.
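The abstract describes a prior built from three ingredients: gamma priors on the diagonal entries, a beta-driven shrinkage scale on each off-diagonal entry, and a restriction of the support to the positive definite cone. The sketch below illustrates that construction as a prior sampler. It is a minimal illustration, not the paper's exact prior: the normal scale-mixture form, the hyperparameters `a`, `b`, `tau`, and the gamma parameters are all assumptions chosen for the sketch, and the positive-definiteness restriction is implemented by simple rejection sampling.

```python
import numpy as np

def sample_sparse_covariance(p, a=0.5, b=0.5, tau=0.1,
                             gamma_shape=2.0, gamma_rate=1.0,
                             max_tries=1000, rng=None):
    """Draw one p x p covariance matrix from a beta-mixture-style
    shrinkage prior, restricted by rejection to the positive definite cone.
    All hyperparameter choices here are illustrative, not the paper's."""
    rng = np.random.default_rng(rng)
    for _ in range(max_tries):
        S = np.zeros((p, p))
        # Diagonal entries: independent gamma priors.
        S[np.diag_indices(p)] = rng.gamma(gamma_shape, 1.0 / gamma_rate, size=p)
        # Off-diagonal entries: normal scale mixture whose standard
        # deviation is driven by a beta-distributed shrinkage weight;
        # a weight near 0 shrinks the entry strongly toward zero.
        iu = np.triu_indices(p, k=1)
        rho = rng.beta(a, b, size=len(iu[0]))
        scale = tau * rho / (1.0 - rho)
        off = rng.normal(0.0, scale)
        S[iu] = off
        S[(iu[1], iu[0])] = off  # symmetrize
        # Keep only draws inside the positive definite cone.
        if np.all(np.linalg.eigvalsh(S) > 0):
            return S
    raise RuntimeError("no positive definite draw within max_tries")
```

With small `tau`, most off-diagonal draws concentrate near zero while the beta weight occasionally allows a large entry, mimicking the sparsity-inducing behavior the abstract attributes to the beta-mixture shrinkage prior; the rejection step plays the role of the support restriction.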

Suggested Citation

  • Lee, Kyoungjae & Jo, Seongil & Lee, Jaeyong, 2022. "The beta-mixture shrinkage prior for sparse covariances with near-minimax posterior convergence rate," Journal of Multivariate Analysis, Elsevier, vol. 192(C).
  • Handle: RePEc:eee:jmvana:v:192:y:2022:i:c:s0047259x22000744
    DOI: 10.1016/j.jmva.2022.105067

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0047259X22000744
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.jmva.2022.105067?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy accessible through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.


    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Lee, Kwangmin & Lee, Jaeyong, 2023. "Post-processed posteriors for sparse covariances," Journal of Econometrics, Elsevier, vol. 236(1).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ikeda, Yuki & Kubokawa, Tatsuya & Srivastava, Muni S., 2016. "Comparison of linear shrinkage estimators of a large covariance matrix in normal and non-normal distributions," Computational Statistics & Data Analysis, Elsevier, vol. 95(C), pages 95-108.
    2. Yang, Guangren & Liu, Yiming & Pan, Guangming, 2019. "Weighted covariance matrix estimation," Computational Statistics & Data Analysis, Elsevier, vol. 139(C), pages 82-98.
    3. Lam, Clifford, 2020. "High-dimensional covariance matrix estimation," LSE Research Online Documents on Economics 101667, London School of Economics and Political Science, LSE Library.
    4. Yuki Ikeda & Tatsuya Kubokawa, 2015. "Linear Shrinkage Estimation of Large Covariance Matrices with Use of Factor Models," CIRJE F-Series CIRJE-F-958, CIRJE, Faculty of Economics, University of Tokyo.
    5. Bailey, Natalia & Pesaran, M. Hashem & Smith, L. Vanessa, 2019. "A multiple testing approach to the regularisation of large sample correlation matrices," Journal of Econometrics, Elsevier, vol. 208(2), pages 507-534.
    6. Ruili Sun & Tiefeng Ma & Shuangzhe Liu & Milind Sathye, 2019. "Improved Covariance Matrix Estimation for Portfolio Risk Measurement: A Review," JRFM, MDPI, vol. 12(1), pages 1-34, March.
    7. Sung, Bongjung & Lee, Jaeyong, 2023. "Covariance structure estimation with Laplace approximation," Journal of Multivariate Analysis, Elsevier, vol. 198(C).
    8. Vahe Avagyan & Andrés M. Alonso & Francisco J. Nogales, 2018. "D-trace estimation of a precision matrix using adaptive Lasso penalties," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 12(2), pages 425-447, June.
    9. Choi, Young-Geun & Lim, Johan & Roy, Anindya & Park, Junyong, 2019. "Fixed support positive-definite modification of covariance matrix estimators via linear shrinkage," Journal of Multivariate Analysis, Elsevier, vol. 171(C), pages 234-249.
    10. Ikeda, Yuki & Kubokawa, Tatsuya, 2016. "Linear shrinkage estimation of large covariance matrices using factor models," Journal of Multivariate Analysis, Elsevier, vol. 152(C), pages 61-81.
    11. Avagyan, Vahe & Nogales, Francisco J., 2015. "D-trace Precision Matrix Estimation Using Adaptive Lasso Penalties," DES - Working Papers. Statistics and Econometrics. WS 21775, Universidad Carlos III de Madrid. Departamento de Estadística.
    12. Jingying Yang, 2024. "Element Aggregation for Estimation of High-Dimensional Covariance Matrices," Mathematics, MDPI, vol. 12(7), pages 1-16, March.
    13. Chen, Jia & Li, Degui & Linton, Oliver, 2019. "A new semiparametric estimation approach for large dynamic covariance matrices with multiple conditioning variables," Journal of Econometrics, Elsevier, vol. 212(1), pages 155-176.
    14. Gautam Sabnis & Debdeep Pati & Anirban Bhattacharya, 2019. "Compressed Covariance Estimation with Automated Dimension Learning," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 81(2), pages 466-481, December.
    15. Fourdrinier, Dominique & Mezoued, Fatiha & Wells, Martin T., 2016. "Estimation of the inverse scatter matrix of an elliptically symmetric distribution," Journal of Multivariate Analysis, Elsevier, vol. 143(C), pages 32-55.
    16. Cui, Ying & Leng, Chenlei & Sun, Defeng, 2016. "Sparse estimation of high-dimensional correlation matrices," Computational Statistics & Data Analysis, Elsevier, vol. 93(C), pages 390-403.
    17. Wang, Shaoxin, 2021. "An efficient numerical method for condition number constrained covariance matrix approximation," Applied Mathematics and Computation, Elsevier, vol. 397(C).
    18. Yang, Yihe & Zhou, Jie & Pan, Jianxin, 2021. "Estimation and optimal structure selection of high-dimensional Toeplitz covariance matrix," Journal of Multivariate Analysis, Elsevier, vol. 184(C).
    19. Jianqing Fan & Yuan Liao & Martina Mincheva, 2013. "Large covariance estimation by thresholding principal orthogonal complements," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 75(4), pages 603-680, September.
    20. Bai, Jushan & Liao, Yuan, 2012. "Efficient Estimation of Approximate Factor Models," MPRA Paper 41558, University Library of Munich, Germany.
