
Testing covariates in high-dimensional regression

Author

Listed:
  • Wei Lan
  • Hansheng Wang
  • Chih-Ling Tsai

Abstract

In a high-dimensional linear regression model, we propose a new procedure for testing statistical significance of a subset of regression coefficients. Specifically, we employ the partial covariances between the response variable and the tested covariates to obtain a test statistic. The resulting test is applicable even if the predictor dimension is much larger than the sample size. Under the null hypothesis, together with boundedness and moment conditions on the predictors, we show that the proposed test statistic is asymptotically standard normal, which is further supported by Monte Carlo experiments. A similar test can be extended to generalized linear models. The practical usefulness of the test is illustrated via an empirical example on paid search advertising. Copyright The Institute of Statistical Mathematics, Tokyo 2014
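
The abstract describes a test statistic built from partial covariances between the response and the tested covariates, with an asymptotically standard normal null distribution. As an informal illustration only, and not the authors' procedure, the sketch below tests a subset of covariates through the sample covariances between the response and the tested columns, calibrating the statistic by permutation rather than by the paper's asymptotic normal approximation. The function name, the permutation calibration, and the synthetic data are all hypothetical.

    # Illustrative sketch (NOT the paper's exact statistic): a simple
    # covariance-based check of whether a subset of covariates is associated
    # with the response. Only the tested columns enter the statistic, so the
    # total predictor dimension p may greatly exceed the sample size n.
    import numpy as np

    def subset_covariance_test(X, y, tested_idx, n_perm=1000, seed=0):
        """Permutation-calibrated test of H0: the covariates in `tested_idx`
        have zero marginal covariance with the response.

        Returns the observed statistic and a permutation p-value.
        """
        rng = np.random.default_rng(seed)
        Xs = X[:, tested_idx]
        yc = y - y.mean()
        Xc = Xs - Xs.mean(axis=0)

        def stat(y_vec):
            # Sum of squared sample covariances between y and each tested covariate.
            cov = Xc.T @ y_vec / len(y_vec)
            return float(np.sum(cov ** 2))

        t_obs = stat(yc)
        # Calibrate by permuting the response, which preserves the covariate
        # structure while breaking any association with y.
        t_perm = np.array([stat(rng.permutation(yc)) for _ in range(n_perm)])
        p_value = (1 + np.sum(t_perm >= t_obs)) / (1 + n_perm)
        return t_obs, p_value

    # Synthetic example with p > n; only the first three covariates are active.
    n, p = 100, 500
    rng = np.random.default_rng(1)
    X = rng.standard_normal((n, p))
    y = X[:, :3] @ np.array([1.0, -1.0, 0.5]) + rng.standard_normal(n)
    print(subset_covariance_test(X, y, tested_idx=[0, 1, 2]))     # small p-value expected
    print(subset_covariance_test(X, y, tested_idx=[10, 11, 12]))  # large p-value expected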

Suggested Citation

  • Wei Lan & Hansheng Wang & Chih-Ling Tsai, 2014. "Testing covariates in high-dimensional regression," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 66(2), pages 279-301, April.
  • Handle: RePEc:spr:aistmt:v:66:y:2014:i:2:p:279-301
    DOI: 10.1007/s10463-013-0414-0

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1007/s10463-013-0414-0
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1007/s10463-013-0414-0?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Zhong, Ping-Shou & Chen, Song Xi, 2011. "Tests for High-Dimensional Regression Coefficients With Factorial Designs," Journal of the American Statistical Association, American Statistical Association, vol. 106(493), pages 260-274.
    2. Fan, Jianqing & Li, Runze, 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    3. Chen, Song Xi & Zhang, Li-Xin & Zhong, Ping-Shou, 2010. "Tests for High-Dimensional Covariance Matrices," Journal of the American Statistical Association, American Statistical Association, vol. 105(490), pages 810-819.
    4. Wang, Hansheng, 2009. "Forward Regression for Ultra-High Dimensional Variable Screening," Journal of the American Statistical Association, American Statistical Association, vol. 104(488), pages 1512-1524.
    5. Fan, Jianqing & Fan, Yingying & Lv, Jinchi, 2008. "High dimensional covariance matrix estimation using a factor model," Journal of Econometrics, Elsevier, vol. 147(1), pages 186-197, November.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Ma, Yingying & Lan, Wei & Wang, Hansheng, 2015. "Testing predictor significance with ultra high dimensional multivariate responses," Computational Statistics & Data Analysis, Elsevier, vol. 83(C), pages 275-286.
    2. Lan, Wei & Ding, Yue & Fang, Zheng & Fang, Kuangnan, 2016. "Testing covariates in high dimension linear regression with latent factors," Journal of Multivariate Analysis, Elsevier, vol. 144(C), pages 25-37.
    3. Yata, Kazuyoshi & Aoshima, Makoto, 2016. "High-dimensional inference on covariance structures via the extended cross-data-matrix methodology," Journal of Multivariate Analysis, Elsevier, vol. 151(C), pages 151-166.
    4. Rui Wang & Xingzhong Xu, 2021. "A Bayesian-motivated test for high-dimensional linear regression models with fixed design matrix," Statistical Papers, Springer, vol. 62(4), pages 1821-1852, August.
    5. Bin Guo & Song Xi Chen, 2016. "Tests for high dimensional generalized linear models," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 78(5), pages 1079-1102, November.
    6. Ma, Yingying & Lan, Wei & Wang, Hansheng, 2015. "A high dimensional two-sample test under a low dimensional factor structure," Journal of Multivariate Analysis, Elsevier, vol. 140(C), pages 162-170.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Wei Lan & Ronghua Luo & Chih-Ling Tsai & Hansheng Wang & Yunhong Yang, 2015. "Testing the Diagonality of a Large Covariance Matrix in a Regression Setting," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 33(1), pages 76-86, January.
    2. Lan, Wei & Zhong, Ping-Shou & Li, Runze & Wang, Hansheng & Tsai, Chih-Ling, 2016. "Testing a single regression coefficient in high dimensional linear models," Journal of Econometrics, Elsevier, vol. 195(1), pages 154-168.
    3. He, Jing & Chen, Song Xi, 2016. "Testing super-diagonal structure in high dimensional covariance matrices," Journal of Econometrics, Elsevier, vol. 194(2), pages 283-297.
    4. Ma, Yingying & Lan, Wei & Wang, Hansheng, 2015. "Testing predictor significance with ultra high dimensional multivariate responses," Computational Statistics & Data Analysis, Elsevier, vol. 83(C), pages 275-286.
    5. Zang, Yangguang & Zhang, Sanguo & Li, Qizhai & Zhang, Qingzhao, 2016. "Jackknife empirical likelihood test for high-dimensional regression coefficients," Computational Statistics & Data Analysis, Elsevier, vol. 94(C), pages 302-316.
    6. Wang, Siyang & Cui, Hengjian, 2015. "A new test for part of high dimensional regression coefficients," Journal of Multivariate Analysis, Elsevier, vol. 137(C), pages 187-203.
    7. Shan Luo & Zehua Chen, 2014. "Sequential Lasso Cum EBIC for Feature Selection With Ultra-High Dimensional Feature Space," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 109(507), pages 1229-1240, September.
    8. Toshio Honda, 2021. "The de-biased group Lasso estimation for varying coefficient models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 73(1), pages 3-29, February.
    9. Loann David Denis Desboulets, 2018. "A Review on Variable Selection in Regression Analysis," Econometrics, MDPI, vol. 6(4), pages 1-27, November.
    10. Zhang, Tonglin, 2024. "Variables selection using L0 penalty," Computational Statistics & Data Analysis, Elsevier, vol. 190(C).
    11. Bai, Jushan & Liao, Yuan, 2016. "Efficient estimation of approximate factor models via penalized maximum likelihood," Journal of Econometrics, Elsevier, vol. 191(1), pages 1-18.
    12. Qiu, Yumou & Chen, Songxi, 2012. "Test for Bandedness of High Dimensional Covariance Matrices with Bandwidth Estimation," MPRA Paper 46242, University Library of Munich, Germany.
    13. Zhao, Bangxin & Liu, Xin & He, Wenqing & Yi, Grace Y., 2021. "Dynamic tilted current correlation for high dimensional variable screening," Journal of Multivariate Analysis, Elsevier, vol. 182(C).
    14. Lam, Clifford, 2020. "High-dimensional covariance matrix estimation," LSE Research Online Documents on Economics 101667, London School of Economics and Political Science, LSE Library.
    15. Yoshimasa Uematsu & Takashi Yamagata, 2019. "Estimation of Weak Factor Models," DSSR Discussion Papers 96, Graduate School of Economics and Management, Tohoku University.
    16. Zhang, Shucong & Zhou, Yong, 2018. "Variable screening for ultrahigh dimensional heterogeneous data via conditional quantile correlations," Journal of Multivariate Analysis, Elsevier, vol. 165(C), pages 1-13.
    17. Gong, Siliang & Zhang, Kai & Liu, Yufeng, 2018. "Efficient test-based variable selection for high-dimensional linear models," Journal of Multivariate Analysis, Elsevier, vol. 166(C), pages 17-31.
    18. Dai, Linlin & Chen, Kani & Sun, Zhihua & Liu, Zhenqiu & Li, Gang, 2018. "Broken adaptive ridge regression and its asymptotic properties," Journal of Multivariate Analysis, Elsevier, vol. 168(C), pages 334-351.
    19. Ruggieri, Eric & Lawrence, Charles E., 2012. "On efficient calculations for Bayesian variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 56(6), pages 1319-1332.
    20. Liming Wang & Xingxiang Li & Xiaoqing Wang & Peng Lai, 2022. "Unified mean-variance feature screening for ultrahigh-dimensional regression," Computational Statistics, Springer, vol. 37(4), pages 1887-1918, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:aistmt:v:66:y:2014:i:2:p:279-301. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form .

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.