
Graphical model selection and estimation for high dimensional tensor data

Author

Listed:
  • He, Shiyuan
  • Yin, Jianxin
  • Li, Hongzhe
  • Wang, Xing

Abstract

Multi-way tensor data are prevalent in many scientific areas such as genomics and biomedical imaging. We consider a K-way tensor-normal distribution, where the precision matrix for each way has a graphical interpretation. We develop an l1-penalized maximum likelihood estimation method and an efficient coordinate descent-based algorithm for model selection and estimation in such tensor-normal graphical models. When the dimensions of the tensor are fixed, we derive the asymptotic distributions and the oracle property for the proposed estimates of the precision matrices. When the dimensions diverge as the sample size goes to infinity, we present the rates of convergence of the estimates and sparsistency results. Simulation results demonstrate that the proposed estimation procedure leads to better estimates of the precision matrices and better identification of the graph structures defined by the precision matrices than standard Gaussian graphical models. We illustrate the methods with an analysis of yeast gene expression data measured over different time points and under different experimental conditions.
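
For readers who want a concrete picture of the estimation scheme the abstract describes, here is a minimal sketch of a blockwise coordinate descent for a tensor-normal graphical model: each mode's precision matrix is updated by an l1-penalized (graphical lasso) step on that mode's sample covariance, computed after whitening the remaining modes with their current estimates. This is not the authors' code; the function names, the use of scikit-learn's graphical_lasso as the inner solver, and the fixed number of sweeps are illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation) of blockwise l1-penalized
# estimation for a K-way tensor-normal graphical model. Each mode-k precision
# matrix is updated by a graphical-lasso step on the mode-k sample covariance
# obtained after whitening the other modes with their current estimates.
import numpy as np
from sklearn.covariance import graphical_lasso  # stand-in for the inner l1 update


def mode_unfold(tensor, mode):
    """Unfold a K-way array along `mode` into a (d_mode x prod(other dims)) matrix."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)


def tensor_graphical_lasso(X, lambdas, n_sweeps=20):
    """X: array of shape (n, d_1, ..., d_K); lambdas: one l1 penalty per mode."""
    n, *dims = X.shape
    K = len(dims)
    Omegas = [np.eye(d) for d in dims]  # initialize all precision matrices at the identity
    for _ in range(n_sweeps):
        for k in range(K):
            # Whiten every mode except k using the Cholesky factor of its current precision.
            Y = X
            for j in range(K):
                if j == k:
                    continue
                L = np.linalg.cholesky(Omegas[j])  # Omega_j = L @ L.T
                Y = np.moveaxis(np.tensordot(L.T, Y, axes=([1], [j + 1])), 0, j + 1)
            # Pooled mode-k sample covariance over the n samples and the other modes.
            m_other = int(np.prod(dims)) // dims[k]
            S_k = sum(mode_unfold(Y[i], k) @ mode_unfold(Y[i], k).T for i in range(n))
            S_k = S_k / (n * m_other)
            # l1-penalized update of the mode-k precision matrix.
            _, Omegas[k] = graphical_lasso(S_k, alpha=lambdas[k])
    return Omegas
```

Updating one mode at a time keeps each inner step a standard one-way Gaussian graphical-lasso problem, which is what makes a coordinate descent strategy natural here; as in any Kronecker-structured model, the mode-wise precision matrices are identified only up to scaling constants with a fixed product.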

Suggested Citation

  • He, Shiyuan & Yin, Jianxin & Li, Hongzhe & Wang, Xing, 2014. "Graphical model selection and estimation for high dimensional tensor data," Journal of Multivariate Analysis, Elsevier, vol. 128(C), pages 165-185.
  • Handle: RePEc:eee:jmvana:v:128:y:2014:i:c:p:165-185
    DOI: 10.1016/j.jmva.2014.03.007

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0047259X14000633
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.jmva.2014.03.007?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. P. Tseng, 2001. "Convergence of a Block Coordinate Descent Method for Nondifferentiable Minimization," Journal of Optimization Theory and Applications, Springer, vol. 109(3), pages 475-494, June.
    2. Yin, Jianxin & Li, Hongzhe, 2012. "Model selection and estimation in the matrix normal graphical model," Journal of Multivariate Analysis, Elsevier, vol. 107(C), pages 119-140.
    3. Lam, Clifford & Fan, Jianqing, 2009. "Sparsistency and rates of convergence in large covariance matrix estimation," LSE Research Online Documents on Economics 31540, London School of Economics and Political Science, LSE Library.
    4. Ming Yuan & Yi Lin, 2007. "Model selection and estimation in the Gaussian graphical model," Biometrika, Biometrika Trust, vol. 94(1), pages 19-35.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Niu, Lu & Liu, Xiumin & Zhao, Junlong, 2020. "Robust estimator of the correlation matrix with sparse Kronecker structure for a high-dimensional matrix-variate," Journal of Multivariate Analysis, Elsevier, vol. 177(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pan, Yuqing & Mai, Qing, 2020. "Efficient computation for differential network analysis with applications to quadratic discriminant analysis," Computational Statistics & Data Analysis, Elsevier, vol. 144(C).
    2. Fan, Xinyan & Zhang, Qingzhao & Ma, Shuangge & Fang, Kuangnan, 2021. "Conditional score matching for high-dimensional partial graphical models," Computational Statistics & Data Analysis, Elsevier, vol. 153(C).
    3. Fang, Qian & Yu, Chen & Weiping, Zhang, 2020. "Regularized estimation of precision matrix for high-dimensional multivariate longitudinal data," Journal of Multivariate Analysis, Elsevier, vol. 176(C).
    4. Seunghwan Lee & Sang Cheol Kim & Donghyeon Yu, 2023. "An efficient GPU-parallel coordinate descent algorithm for sparse precision matrix estimation via scaled lasso," Computational Statistics, Springer, vol. 38(1), pages 217-242, March.
    5. Huangdi Yi & Qingzhao Zhang & Cunjie Lin & Shuangge Ma, 2022. "Information‐incorporated Gaussian graphical model for gene expression data," Biometrics, The International Biometric Society, vol. 78(2), pages 512-523, June.
    6. Lam, Clifford, 2020. "High-dimensional covariance matrix estimation," LSE Research Online Documents on Economics 101667, London School of Economics and Political Science, LSE Library.
    7. Wang, Luheng & Chen, Zhao & Wang, Christina Dan & Li, Runze, 2020. "Ultrahigh dimensional precision matrix estimation via refitted cross validation," Journal of Econometrics, Elsevier, vol. 215(1), pages 118-130.
    8. Gautam Sabnis & Debdeep Pati & Anirban Bhattacharya, 2019. "Compressed Covariance Estimation with Automated Dimension Learning," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 81(2), pages 466-481, December.
    9. Maboudou-Tchao, Edgard M. & Agboto, Vincent, 2013. "Monitoring the covariance matrix with fewer observations than variables," Computational Statistics & Data Analysis, Elsevier, vol. 64(C), pages 99-112.
    10. Lin Zhang & Andrew DiLernia & Karina Quevedo & Jazmin Camchong & Kelvin Lim & Wei Pan, 2021. "A random covariance model for bi‐level graphical modeling with application to resting‐state fMRI data," Biometrics, The International Biometric Society, vol. 77(4), pages 1385-1396, December.
    11. Chen, Xin & Yang, Dan & Xu, Yan & Xia, Yin & Wang, Dong & Shen, Haipeng, 2023. "Testing and support recovery of correlation structures for matrix-valued observations with an application to stock market data," Journal of Econometrics, Elsevier, vol. 232(2), pages 544-564.
    12. Murat Genç, 2022. "A new double-regularized regression using Liu and lasso regularization," Computational Statistics, Springer, vol. 37(1), pages 159-227, March.
    13. Tan, Kean Ming & Witten, Daniela & Shojaie, Ali, 2015. "The cluster graphical lasso for improved estimation of Gaussian graphical models," Computational Statistics & Data Analysis, Elsevier, vol. 85(C), pages 23-36.
    14. Sung, Bongjung & Lee, Jaeyong, 2023. "Covariance structure estimation with Laplace approximation," Journal of Multivariate Analysis, Elsevier, vol. 198(C).
    15. S Klaassen & J Kueck & M Spindler & V Chernozhukov, 2023. "Uniform inference in high-dimensional Gaussian graphical models," Biometrika, Biometrika Trust, vol. 110(1), pages 51-68.
    16. Bailey, Natalia & Pesaran, M. Hashem & Smith, L. Vanessa, 2019. "A multiple testing approach to the regularisation of large sample correlation matrices," Journal of Econometrics, Elsevier, vol. 208(2), pages 507-534.
    17. Katayama, Shota & Imori, Shinpei, 2014. "Lasso penalized model selection criteria for high-dimensional multivariate linear regression analysis," Journal of Multivariate Analysis, Elsevier, vol. 132(C), pages 138-150.
    18. Zachary D Kurtz & Christian L Müller & Emily R Miraldi & Dan R Littman & Martin J Blaser & Richard A Bonneau, 2015. "Sparse and Compositionally Robust Inference of Microbial Ecological Networks," PLOS Computational Biology, Public Library of Science, vol. 11(5), pages 1-25, May.
    19. Liu, Weidong & Luo, Xi, 2015. "Fast and adaptive sparse precision matrix estimation in high dimensions," Journal of Multivariate Analysis, Elsevier, vol. 135(C), pages 153-162.
    20. Vahe Avagyan & Andrés M. Alonso & Francisco J. Nogales, 2018. "D-trace estimation of a precision matrix using adaptive Lasso penalties," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 12(2), pages 425-447, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jmvana:v:128:y:2014:i:c:p:165-185. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/622892/description#description .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.