
Multivariate sparse Laplacian shrinkage for joint estimation of two graphical structures

Author

Listed:
  • Yang, Yuehan
  • Xia, Siwei
  • Yang, Hu

Abstract

Multivariate regression models are widely used in various fields for fitting multiple responses. In this paper, we propose a sparse Laplacian shrinkage estimator for high-dimensional multivariate regression models. We consider two graphical structures: one among the predictors and one among the responses. The proposed method explores the regression relationship while allowing the predictors and responses to follow different multivariate normal distributions with general covariance matrices. In practice, the correlations within the data are often complex and interact with each other through the regression function. The proposed method addresses this by building a structured penalty that encourages shared structure between the graphs and the regression coefficients. We provide theoretical results under reasonable conditions and discuss the related algorithm. The effectiveness of the proposed method is demonstrated in a variety of simulations as well as an application to the index tracking problem in the stock market.
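
To fix ideas, a generic objective of this kind can be sketched as follows; this is an illustrative form drawn from the sparse Laplacian shrinkage literature, not necessarily the exact penalty of the article, and the symbols (design matrix X, response matrix Y, graph Laplacians L_X and L_Y built from the predictor and response graphs, tuning parameters lambda_1, lambda_2, lambda_3) are assumptions introduced here for exposition:

    \hat{B} \;=\; \arg\min_{B \in \mathbb{R}^{p \times q}} \; \frac{1}{2n}\,\lVert Y - X B \rVert_F^{2} \;+\; \lambda_1 \lVert B \rVert_1 \;+\; \frac{\lambda_2}{2}\,\operatorname{tr}\!\left(B^{\top} L_X B\right) \;+\; \frac{\lambda_3}{2}\,\operatorname{tr}\!\left(B\, L_Y B^{\top}\right)

Here the l1 term induces sparsity in the coefficient matrix, the term tr(B' L_X B) shrinks rows of B toward each other when the corresponding predictors are adjacent in the predictor graph, and tr(B L_Y B') does the same for columns of B corresponding to connected responses, so the estimated coefficients share structure with both graphs.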

Suggested Citation

  • Yang, Yuehan & Xia, Siwei & Yang, Hu, 2023. "Multivariate sparse Laplacian shrinkage for joint estimation of two graphical structures," Computational Statistics & Data Analysis, Elsevier, vol. 178(C).
  • Handle: RePEc:eee:csdana:v:178:y:2023:i:c:s0167947322002006
    DOI: 10.1016/j.csda.2022.107620

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947322002006
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2022.107620?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lukas Meier & Sara Van De Geer & Peter Bühlmann, 2008. "The group lasso for logistic regression," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 70(1), pages 53-71, February.
    2. Friedman, Jerome H. & Hastie, Trevor & Tibshirani, Rob, 2010. "Regularization Paths for Generalized Linear Models via Coordinate Descent," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 33(i01).
    3. A. Mukherjee & K. Chen & N. Wang & J. Zhu, 2015. "On the degrees of freedom of reduced-rank estimators in multivariate regression," Biometrika, Biometrika Trust, vol. 102(2), pages 457-477.
    4. Tao Zou & Wei Lan & Hansheng Wang & Chih-Ling Tsai, 2017. "Covariance Regression Analysis," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(517), pages 266-281, January.
    5. Yanming Li & Bin Nan & Ji Zhu, 2015. "Multivariate sparse group lasso for the multivariate multiple linear regression with an arbitrary group structure," Biometrics, The International Biometric Society, vol. 71(2), pages 354-363, June.
    6. T. Tony Cai & Hongzhe Li & Weidong Liu & Jichun Xie, 2013. "Covariate-adjusted precision matrix estimation with an application in genetical genomics," Biometrika, Biometrika Trust, vol. 100(1), pages 139-156.
    7. Ming Yuan & Yi Lin, 2007. "Model selection and estimation in the Gaussian graphical model," Biometrika, Biometrika Trust, vol. 94(1), pages 19-35.
    8. Jianqing Fan & Jingjin Zhang & Ke Yu, 2012. "Vast Portfolio Selection With Gross-Exposure Constraints," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(498), pages 592-606, June.
    9. Aaron J. Molstad & Adam J. Rothman, 2016. "Indirect multivariate response linear regression," Biometrika, Biometrika Trust, vol. 103(3), pages 595-607.
    10. Lexin Li & Xin Zhang, 2017. "Parsimonious Tensor Response Regression," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(519), pages 1131-1146, July.
    11. I. Wilms & C. Croux, 2018. "An algorithm for the multivariate group lasso with covariance estimation," Journal of Applied Statistics, Taylor & Francis Journals, vol. 45(4), pages 668-681, March.
    12. Lee, Wonyul & Liu, Yufeng, 2012. "Simultaneous multiple response regression and inverse covariance matrix estimation via penalized Gaussian maximum likelihood," Journal of Multivariate Analysis, Elsevier, vol. 111(C), pages 241-255.
    13. Daye, Z. John & Jeng, X. Jessie, 2009. "Shrinkage and model selection with correlated variables via weighted fusion," Computational Statistics & Data Analysis, Elsevier, vol. 53(4), pages 1284-1298, February.
    14. Wu, Lan & Yang, Yuehan & Liu, Hanzhong, 2014. "Nonnegative-lasso and application in index tracking," Computational Statistics & Data Analysis, Elsevier, vol. 70(C), pages 116-126.
    15. Kun Chen & Hongbo Dong & Kung-Sik Chan, 2013. "Reduced rank regression via adaptive nuclear norm penalization," Biometrika, Biometrika Trust, vol. 100(4), pages 901-920.
    16. Ming Yuan & Yi Lin, 2006. "Model selection and estimation in regression with grouped variables," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 68(1), pages 49-67, February.
    17. Lisha Chen & Jianhua Z. Huang, 2012. "Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(500), pages 1533-1545, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Luo, Chongliang & Liang, Jian & Li, Gen & Wang, Fei & Zhang, Changshui & Dey, Dipak K. & Chen, Kun, 2018. "Leveraging mixed and incomplete outcomes via reduced-rank modeling," Journal of Multivariate Analysis, Elsevier, vol. 167(C), pages 378-394.
    2. Siwei Xia & Yuehan Yang & Hu Yang, 2022. "Sparse Laplacian Shrinkage with the Graphical Lasso Estimator for Regression Problems," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 31(1), pages 255-277, March.
    3. Bai, Ray & Ghosh, Malay, 2018. "High-dimensional multivariate posterior consistency under global–local shrinkage priors," Journal of Multivariate Analysis, Elsevier, vol. 167(C), pages 157-170.
    4. Pei Wang & Shunjie Chen & Sijia Yang, 2022. "Recent Advances on Penalized Regression Models for Biological Data," Mathematics, MDPI, vol. 10(19), pages 1-24, October.
    5. Goh, Gyuhyeong & Dey, Dipak K. & Chen, Kun, 2017. "Bayesian sparse reduced rank multivariate regression," Journal of Multivariate Analysis, Elsevier, vol. 157(C), pages 14-28.
    6. Kohei Yoshikawa & Shuichi Kawano, 2023. "Sparse reduced-rank regression for simultaneous rank and variable selection via manifold optimization," Computational Statistics, Springer, vol. 38(1), pages 53-75, March.
    7. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    8. Dong Liu & Changwei Zhao & Yong He & Lei Liu & Ying Guo & Xinsheng Zhang, 2023. "Simultaneous cluster structure learning and estimation of heterogeneous graphs for matrix‐variate fMRI data," Biometrics, The International Biometric Society, vol. 79(3), pages 2246-2259, September.
    9. Zanhua Yin, 2020. "Variable selection for sparse logistic regression," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 83(7), pages 821-836, October.
    10. Liu, Jianyu & Yu, Guan & Liu, Yufeng, 2019. "Graph-based sparse linear discriminant analysis for high-dimensional classification," Journal of Multivariate Analysis, Elsevier, vol. 171(C), pages 250-269.
    11. Lichun Wang & Yuan You & Heng Lian, 2015. "Convergence and sparsity of Lasso and group Lasso in high-dimensional generalized linear models," Statistical Papers, Springer, vol. 56(3), pages 819-828, August.
    12. Dmitry Kobak & Yves Bernaerts & Marissa A. Weis & Federico Scala & Andreas S. Tolias & Philipp Berens, 2021. "Sparse reduced‐rank regression for exploratory visualisation of paired multivariate data," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 70(4), pages 980-1000, August.
    13. Canhong Wen & Zhenduo Li & Ruipeng Dong & Yijin Ni & Wenliang Pan, 2023. "Simultaneous Dimension Reduction and Variable Selection for Multinomial Logistic Regression," INFORMS Journal on Computing, INFORMS, vol. 35(5), pages 1044-1060, September.
    14. Faisal Zahid & Gerhard Tutz, 2013. "Multinomial logit models with implicit variable selection," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 7(4), pages 393-416, December.
    15. Gerhard Tutz & Gunther Schauberger, 2015. "A Penalty Approach to Differential Item Functioning in Rasch Models," Psychometrika, Springer;The Psychometric Society, vol. 80(1), pages 21-43, March.
    16. Murat Genç, 2022. "A new double-regularized regression using Liu and lasso regularization," Computational Statistics, Springer, vol. 37(1), pages 159-227, March.
    17. Yen, Tso-Jung & Yen, Yu-Min, 2016. "Structured variable selection via prior-induced hierarchical penalty functions," Computational Statistics & Data Analysis, Elsevier, vol. 96(C), pages 87-103.
    18. Pan, Yuqing & Mai, Qing, 2020. "Efficient computation for differential network analysis with applications to quadratic discriminant analysis," Computational Statistics & Data Analysis, Elsevier, vol. 144(C).
    19. Fan, Xinyan & Zhang, Qingzhao & Ma, Shuangge & Fang, Kuangnan, 2021. "Conditional score matching for high-dimensional partial graphical models," Computational Statistics & Data Analysis, Elsevier, vol. 153(C).
    20. Lee, Wonyul & Liu, Yufeng, 2012. "Simultaneous multiple response regression and inverse covariance matrix estimation via penalized Gaussian maximum likelihood," Journal of Multivariate Analysis, Elsevier, vol. 111(C), pages 241-255.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:178:y:2023:i:c:s0167947322002006. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.