Printed from https://ideas.repec.org/a/eee/jmvana/v184y2021ics0047259x21000312.html

Minimum Average Variance Estimation with group Lasso for the multivariate response Central Mean Subspace

Author

Listed:
  • Zhang, Hong-Fan

Abstract

The Minimum Average Variance Estimation (MAVE) method and its variants have proven to be effective approaches to dimension reduction problems. However, to the best of our knowledge, using MAVE to estimate the Central Mean Subspace (CMS) with a multivariate response has received little attention. This paper proposes a weighted version of MAVE for the CMS with a multivariate response. The proposed weighted MAVE method takes into account the correlations among the responses and is a natural extension of the original MAVE method, which targets only the univariate response CMS. An algorithm to implement the weighted MAVE method is provided, and the asymptotic distribution of the MAVE estimator under the multivariate response setting is derived. In addition, for variable selection, we propose incorporating adaptive group lasso regularization to induce sparse solutions. We show that the resulting sparse estimator can be computed efficiently and that the associated BIC-type criterion consistently selects the relevant variables. Simulation experiments and a real data analysis demonstrate the effectiveness and usefulness of the proposed methods.
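
To make the construction more concrete, here is a minimal sketch, in LaTeX, of the criterion the paper builds on. It states the original MAVE objective of Xia, Tong, Li and Zhu (2002) and a plausible weighted, group-lasso-penalized analogue for multivariate responses; the weight matrix $W$, the local slope matrices $C_j$, and the adaptive weights $\tau_k$ below are illustrative assumptions rather than the exact quantities defined in the paper.

Assume $\mathrm{E}(Y \mid X) = g(B_0^\top X)$ with $X \in \mathbb{R}^p$, $Y \in \mathbb{R}^q$, and $B_0 \in \mathbb{R}^{p \times d}$ satisfying $B_0^\top B_0 = I_d$. The original (univariate-response) MAVE estimator solves
$$ \min_{B^\top B = I_d,\; \{a_j, b_j\}} \; \sum_{j=1}^{n} \sum_{i=1}^{n} \left\{ y_i - a_j - b_j^\top B^\top (x_i - x_j) \right\}^2 w_{ij}, $$
with kernel weights $w_{ij} \propto K_h\{B^\top (x_i - x_j)\}$. A weighted multivariate-response analogue replaces the squared residual by a quadratic form in a positive-definite matrix $W \in \mathbb{R}^{q \times q}$ encoding the correlations among the responses,
$$ \sum_{j=1}^{n} \sum_{i=1}^{n} r_{ij}^\top W \, r_{ij} \, w_{ij}, \qquad r_{ij} = y_i - a_j - C_j B^\top (x_i - x_j), $$
where $C_j \in \mathbb{R}^{q \times d}$ is the local slope matrix at $x_j$. Variable-level sparsity can then be induced by adding an adaptive group lasso penalty on the rows $\beta_1, \dots, \beta_p$ of $B$,
$$ \lambda \sum_{k=1}^{p} \tau_k \, \lVert \beta_k \rVert_2, $$
so that an entire row, and with it the predictor $X_k$ in every reduction direction, can be shrunk exactly to zero.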

Suggested Citation

  • Zhang, Hong-Fan, 2021. "Minimum Average Variance Estimation with group Lasso for the multivariate response Central Mean Subspace," Journal of Multivariate Analysis, Elsevier, vol. 184(C).
  • Handle: RePEc:eee:jmvana:v:184:y:2021:i:c:s0047259x21000312
    DOI: 10.1016/j.jmva.2021.104753

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0047259X21000312
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.jmva.2021.104753?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Weng, Jiaying, 2022. "Fourier transform sparse inverse regression estimators for sufficient variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 168(C).
    2. Wenquan Cui & Jianjun Xu & Yuehua Wu, 2023. "A new reproducing kernel‐based nonlinear dimension reduction method for survival data," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics; Finnish Statistical Society; Norwegian Statistical Association; Swedish Statistical Association, vol. 50(3), pages 1365-1390, September.
    3. Wang, Pei & Yin, Xiangrong & Yuan, Qingcong & Kryscio, Richard, 2021. "Feature filter for estimating central mean subspace and its sparse solution," Computational Statistics & Data Analysis, Elsevier, vol. 163(C).
    4. Zifang Guo & Lexin Li & Wenbin Lu & Bing Li, 2015. "Groupwise Dimension Reduction via Envelope Method," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(512), pages 1515-1527, December.
    5. Wu, Runxiong & Chen, Xin, 2021. "MM algorithms for distance covariance based sufficient dimension reduction and sufficient variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 155(C).
    6. Hilafu, Haileab & Yin, Xiangrong, 2013. "Sufficient dimension reduction in multivariate regressions with categorical predictors," Computational Statistics & Data Analysis, Elsevier, vol. 63(C), pages 139-147.
    7. Yang Liu & Francesca Chiaromonte & Bing Li, 2017. "Structured Ordinary Least Squares: A Sufficient Dimension Reduction approach for regressions with partitioned predictors and heterogeneous units," Biometrics, The International Biometric Society, vol. 73(2), pages 529-539, June.
    8. Wang, Qin & Xue, Yuan, 2021. "An ensemble of inverse moment estimators for sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 161(C).
    9. Pircalabelu, Eugen & Artemiou, Andreas, 2021. "Graph informed sliced inverse regression," Computational Statistics & Data Analysis, Elsevier, vol. 164(C).
    10. Zhou Yu & Yuexiao Dong & Li-Xing Zhu, 2016. "Trace Pursuit: A General Framework for Model-Free Variable Selection," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(514), pages 813-821, April.
    11. Sheng, Wenhui & Yin, Xiangrong, 2013. "Direction estimation in single-index models via distance covariance," Journal of Multivariate Analysis, Elsevier, vol. 122(C), pages 148-161.
    12. Baek, Seungchul & Park, Hoyoung & Park, Junyong, 2024. "Variable selection using data splitting and projection for principal fitted component models in high dimension," Computational Statistics & Data Analysis, Elsevier, vol. 196(C).
    13. Lu Li & Kai Tan & Xuerong Meggie Wen & Zhou Yu, 2023. "Variable-dependent partial dimension reduction," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 32(2), pages 521-541, June.
    14. Hyung Park & Thaddeus Tarpey & Eva Petkova & R. Todd Ogden, 2024. "A high-dimensional single-index regression for interactions between treatment and covariates," Statistical Papers, Springer, vol. 65(7), pages 4025-4056, September.
    15. Fei Jin & Lung-fei Lee, 2018. "Lasso Maximum Likelihood Estimation of Parametric Models with Singular Information Matrices," Econometrics, MDPI, vol. 6(1), pages 1-24, February.
    16. Cheng, Qing & Zhu, Liping, 2017. "On relative efficiency of principal Hessian directions," Statistics & Probability Letters, Elsevier, vol. 126(C), pages 108-113.
    17. Yunquan Song & Zitong Li & Minglu Fang, 2022. "Robust Variable Selection Based on Penalized Composite Quantile Regression for High-Dimensional Single-Index Models," Mathematics, MDPI, vol. 10(12), pages 1-17, June.
    18. Jin, Fei & Lee, Lung-fei, 2018. "Irregular N2SLS and LASSO estimation of the matrix exponential spatial specification model," Journal of Econometrics, Elsevier, vol. 206(2), pages 336-358.
    19. Jianqing Fan & Jinchi Lv, 2008. "Sure independence screening for ultrahigh dimensional feature space," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 70(5), pages 849-911, November.
    20. Shin, Seung Jun & Artemiou, Andreas, 2017. "Penalized principal logistic regression for sparse sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 111(C), pages 48-58.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jmvana:v:184:y:2021:i:c:s0047259x21000312. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/622892/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.