
An efficient algorithm for sparse inverse covariance matrix estimation based on dual formulation

Author

Listed:
  • Li, Peili
  • Xiao, Yunhai

Abstract

Estimating a large sparse inverse covariance matrix plays a fundamental role in modern multivariate analysis, because the zero entries capture the conditional independence between pairs of variables given all other variables. This estimation task can be realized by penalizing the maximum likelihood estimation with an adaptive group lasso penalty imposed directly on the elements of the inverse, which allows the estimate to have a blockwise sparse structure that is particularly useful in some applications. In this paper, we are particularly interested in the implementation of optimization algorithms for minimizing a class of log-determinant models. The considered minimization model, on the one hand, contains a large number of popular sparse models as special cases, but on the other hand, it poses additional challenges, especially in high-dimensional situations. Instead of targeting the challenging optimization problem directly, we employ the symmetric Gauss–Seidel (sGS) iteration based alternating direction method of multipliers (ADMM) to tackle the 3-block nonsmooth dual program. By choosing an appropriate proximal term, it is shown that the implemented sGS-ADMM is equivalent to a 2-block ADMM, so its convergence follows directly from existing theoretical results. Numerical experiments on synthetic and real data sets, including performance comparisons with the directly extended ADMM, demonstrate that the implemented algorithm is effective in estimating large and sparse inverse covariance matrices.
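To make the underlying optimization concrete, the sketch below implements a standard 2-block ADMM for the classical ℓ1-penalized log-determinant (graphical lasso) problem, minimize −logdet(X) + tr(SX) + λ‖X‖₁ over positive definite X. This is a simplified special case of the class of models the abstract describes, not the paper's sGS-ADMM dual algorithm; the function name, step size `rho`, and iteration count are illustrative choices, not the authors' settings.

```python
import numpy as np

def graphical_lasso_admm(S, lam, rho=1.0, n_iter=200):
    """Sparse inverse covariance estimate via 2-block ADMM for
    minimize -logdet(X) + tr(S X) + lam * ||X||_1  (X positive definite)."""
    p = S.shape[0]
    Z = np.eye(p)          # sparse copy of the estimate
    U = np.zeros((p, p))   # scaled dual (multiplier) variable
    for _ in range(n_iter):
        # X-update: prox of -logdet(.) + tr(S .); solving
        # rho*X - X^{-1} = rho*(Z - U) - S via eigendecomposition
        theta, Q = np.linalg.eigh(rho * (Z - U) - S)
        x = (theta + np.sqrt(theta**2 + 4.0 * rho)) / (2.0 * rho)
        X = (Q * x) @ Q.T                     # eigenvalues x are all > 0
        # Z-update: elementwise soft-thresholding (prox of the l1 term)
        A = X + U
        Z = np.sign(A) * np.maximum(np.abs(A) - lam / rho, 0.0)
        # dual ascent on the scaled multiplier
        U += X - Z
    return Z
```

The X-update keeps the iterate positive definite by construction (each eigenvalue solves a scalar quadratic with a positive root), while the Z-update produces the exact zeros that encode conditional independence. The paper's algorithm instead works on the 3-block dual program with an sGS proximal correction, which avoids the convergence difficulties of naively extending ADMM to three blocks.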

Suggested Citation

  • Li, Peili & Xiao, Yunhai, 2018. "An efficient algorithm for sparse inverse covariance matrix estimation based on dual formulation," Computational Statistics & Data Analysis, Elsevier, vol. 128(C), pages 292-307.
  • Handle: RePEc:eee:csdana:v:128:y:2018:i:c:p:292-307
    DOI: 10.1016/j.csda.2018.07.011

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947318301774
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2018.07.011?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Adam J. Rothman, 2012. "Positive definite estimators of large covariance matrices," Biometrika, Biometrika Trust, vol. 99(3), pages 733-740.
    2. Cai, Tony & Liu, Weidong, 2011. "Adaptive Thresholding for Sparse Covariance Matrix Estimation," Journal of the American Statistical Association, American Statistical Association, vol. 106(494), pages 672-684.
    3. Lam, Clifford & Fan, Jianqing, 2009. "Sparsistency and rates of convergence in large covariance matrix estimation," LSE Research Online Documents on Economics 31540, London School of Economics and Political Science, LSE Library.
    4. Patrick Danaher & Pei Wang & Daniela M. Witten, 2014. "The joint graphical lasso for inverse covariance estimation across multiple classes," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 76(2), pages 373-397, March.
    5. Rothman, Adam J. & Levina, Elizaveta & Zhu, Ji, 2009. "Generalized Thresholding of Large Covariance Matrices," Journal of the American Statistical Association, American Statistical Association, vol. 104(485), pages 177-186.
    6. NESTEROV, Yu., 2005. "Smooth minimization of non-smooth functions," LIDAM Reprints CORE 1819, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    7. Ming Yuan & Yi Lin, 2007. "Model selection and estimation in the Gaussian graphical model," Biometrika, Biometrika Trust, vol. 94(1), pages 19-35.
    8. Dobra, Adrian & Hans, Chris & Jones, Beatrix & Nevins, J.R.Joseph R. & Yao, Guang & West, Mike, 2004. "Sparse graphical models for exploring gene expression data," Journal of Multivariate Analysis, Elsevier, vol. 90(1), pages 196-212, July.
    9. Ming Yuan & Yi Lin, 2006. "Model selection and estimation in regression with grouped variables," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 68(1), pages 49-67, February.
    10. Jianhua Z. Huang & Naiping Liu & Mohsen Pourahmadi & Linxu Liu, 2006. "Covariance matrix selection and estimation via penalised normal likelihood," Biometrika, Biometrika Trust, vol. 93(1), pages 85-98, March.
    11. Lingzhou Xue & Shiqian Ma & Hui Zou, 2012. "Positive-Definite ℓ 1 -Penalized Estimation of Large Covariance Matrices," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(500), pages 1480-1491, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Takashi Nakagaki & Mituhiro Fukuda & Sunyoung Kim & Makoto Yamashita, 2020. "A dual spectral projected gradient method for log-determinant semidefinite problems," Computational Optimization and Applications, Springer, vol. 76(1), pages 33-68, May.
    2. Yanyun Ding & Peili Li & Yunhai Xiao & Haibin Zhang, 2023. "Efficient dual ADMMs for sparse compressive sensing MRI reconstruction," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 97(2), pages 207-231, April.
    3. Aifen Feng & Jingya Fan & Zhengfen Jin & Mengmeng Zhao & Xiaogai Chang, 2023. "Research Based on High-Dimensional Fused Lasso Partially Linear Model," Mathematics, MDPI, vol. 11(12), pages 1-15, June.
    4. Yan Zhang & Jiyuan Tao & Zhixiang Yin & Guoqiang Wang, 2022. "Improved Large Covariance Matrix Estimation Based on Efficient Convex Combination and Its Application in Portfolio Optimization," Mathematics, MDPI, vol. 10(22), pages 1-15, November.
    5. Li, Xin & Wu, Dongya & Li, Chong & Wang, Jinhua & Yao, Jen-Chih, 2020. "Sparse recovery via nonconvex regularized M-estimators over ℓq-balls," Computational Statistics & Data Analysis, Elsevier, vol. 152(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bailey, Natalia & Pesaran, M. Hashem & Smith, L. Vanessa, 2019. "A multiple testing approach to the regularisation of large sample correlation matrices," Journal of Econometrics, Elsevier, vol. 208(2), pages 507-534.
    2. Ziqi Chen & Chenlei Leng, 2016. "Dynamic Covariance Models," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(515), pages 1196-1207, July.
    3. Lam, Clifford, 2020. "High-dimensional covariance matrix estimation," LSE Research Online Documents on Economics 101667, London School of Economics and Political Science, LSE Library.
    4. Cui, Ying & Leng, Chenlei & Sun, Defeng, 2016. "Sparse estimation of high-dimensional correlation matrices," Computational Statistics & Data Analysis, Elsevier, vol. 93(C), pages 390-403.
    5. Yang, Yihe & Zhou, Jie & Pan, Jianxin, 2021. "Estimation and optimal structure selection of high-dimensional Toeplitz covariance matrix," Journal of Multivariate Analysis, Elsevier, vol. 184(C).
    6. Wang, Luheng & Chen, Zhao & Wang, Christina Dan & Li, Runze, 2020. "Ultrahigh dimensional precision matrix estimation via refitted cross validation," Journal of Econometrics, Elsevier, vol. 215(1), pages 118-130.
    7. Gautam Sabnis & Debdeep Pati & Anirban Bhattacharya, 2019. "Compressed Covariance Estimation with Automated Dimension Learning," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 81(2), pages 466-481, December.
    8. Vahe Avagyan & Andrés M. Alonso & Francisco J. Nogales, 2018. "D-trace estimation of a precision matrix using adaptive Lasso penalties," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 12(2), pages 425-447, June.
    9. Banerjee, Sayantan & Ghosal, Subhashis, 2015. "Bayesian structure learning in graphical models," Journal of Multivariate Analysis, Elsevier, vol. 136(C), pages 147-162.
    10. Choi, Young-Geun & Lim, Johan & Roy, Anindya & Park, Junyong, 2019. "Fixed support positive-definite modification of covariance matrix estimators via linear shrinkage," Journal of Multivariate Analysis, Elsevier, vol. 171(C), pages 234-249.
    11. Denis Belomestny & Mathias Trabs & Alexandre Tsybakov, 2017. "Sparse covariance matrix estimation in high-dimensional deconvolution," Working Papers 2017-25, Center for Research in Economics and Statistics.
    12. Avagyan, Vahe & Nogales, Francisco J., 2015. "D-trace Precision Matrix Estimation Using Adaptive Lasso Penalties," DES - Working Papers. Statistics and Econometrics. WS 21775, Universidad Carlos III de Madrid. Departamento de Estadística.
    13. Benjamin Poignard & Manabu Asai, 2023. "Estimation of high-dimensional vector autoregression via sparse precision matrix," The Econometrics Journal, Royal Economic Society, vol. 26(2), pages 307-326.
    14. Kashlak, Adam B., 2021. "Non-asymptotic error controlled sparse high dimensional precision matrix estimation," Journal of Multivariate Analysis, Elsevier, vol. 181(C).
    15. Yan Zhang & Jiyuan Tao & Zhixiang Yin & Guoqiang Wang, 2022. "Improved Large Covariance Matrix Estimation Based on Efficient Convex Combination and Its Application in Portfolio Optimization," Mathematics, MDPI, vol. 10(22), pages 1-15, November.
    16. Jingying Yang, 2024. "Element Aggregation for Estimation of High-Dimensional Covariance Matrices," Mathematics, MDPI, vol. 12(7), pages 1-16, March.
    17. Arnab Chakrabarti & Rituparna Sen, 2018. "Some Statistical Problems with High Dimensional Financial data," Papers 1808.02953, arXiv.org.
    18. Kang, Xiaoning & Wang, Mingqiu, 2021. "Ensemble sparse estimation of covariance structure for exploring genetic disease data," Computational Statistics & Data Analysis, Elsevier, vol. 159(C).
    19. Chen, Xin & Yang, Dan & Xu, Yan & Xia, Yin & Wang, Dong & Shen, Haipeng, 2023. "Testing and support recovery of correlation structures for matrix-valued observations with an application to stock market data," Journal of Econometrics, Elsevier, vol. 232(2), pages 544-564.
    20. Joo, Young C. & Park, Sung Y., 2021. "Optimal portfolio selection using a simple double-shrinkage selection rule," Finance Research Letters, Elsevier, vol. 43(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:128:y:2018:i:c:p:292-307. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.