
Regularized estimation for the least absolute relative error models with a diverging number of covariates

Author

Listed:
  • Xia, Xiaochao
  • Liu, Zhi
  • Yang, Hu

Abstract

This paper considers variable selection for the least absolute relative error (LARE) model, where the dimension of the model, p_n, is allowed to increase with the sample size n. Under some mild regularity conditions, we establish the oracle properties, including the consistency of model selection and the asymptotic normality of the estimators of the non-zero parameters. An adaptive weighting scheme is considered in the regularization, which admits the adaptive Lasso, SCAD and MCP penalties via linear approximation. The theoretical results allow the dimension to diverge at the rate p_n = o(n^{1/2}) for consistency and p_n = o(n^{1/3}) for asymptotic normality. Furthermore, a practical variable selection procedure based on least squares approximation (LSA) is studied and its oracle property is also provided. Numerical studies are carried out to evaluate the performance of the proposed approaches.
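The penalized criterion described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the multiplicative model Y_i = exp(x_i'β)ε_i, uses the LARE loss of Chen, Guo, Lin and Ying (2010), adds an adaptive-Lasso-type weighted L1 penalty (one instance of the adaptive weighting scheme), and relies on a generic derivative-free optimizer; the simulated data, the fixed tuning value lam, and all variable names are illustrative assumptions.

```python
# Hedged sketch of adaptively weighted LARE estimation (illustrative only,
# not the authors' algorithm or code).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p = 200, 6
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -0.8, 0.5, 0.0, 0.0, 0.0])
# Multiplicative model: positive responses Y_i = exp(x_i' beta) * eps_i.
Y = np.exp(X @ beta_true) * np.exp(rng.normal(scale=0.3, size=n))

def lare_loss(beta):
    """LARE criterion: sum of relative errors w.r.t. response and prediction."""
    eta = np.clip(X @ beta, -30.0, 30.0)   # guard against overflow in exp
    fitted = np.exp(eta)
    return np.sum(np.abs(Y - fitted) / Y + np.abs(Y - fitted) / fitted)

# Unpenalized LARE fit; its components define adaptive weights w_j = 1/|beta_j|.
opts = {"maxiter": 20000, "xatol": 1e-6, "fatol": 1e-6}
beta_init = minimize(lare_loss, np.zeros(p), method="Nelder-Mead", options=opts).x
w = 1.0 / (np.abs(beta_init) + 1e-8)

lam = 0.5  # illustrative tuning value; in practice chosen by a BIC-type criterion
def penalized_loss(beta):
    return lare_loss(beta) + lam * np.sum(w * np.abs(beta))

beta_hat = minimize(penalized_loss, beta_init, method="Nelder-Mead", options=opts).x
print("estimate:", np.round(beta_hat, 3))
```

In this sketch the weighted L1 penalty plays the role of the adaptive Lasso; the SCAD and MCP cases would instead set w_j to the corresponding penalty derivative evaluated at the initial estimate, in the spirit of the linear approximation mentioned in the abstract.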

Suggested Citation

  • Xia, Xiaochao & Liu, Zhi & Yang, Hu, 2016. "Regularized estimation for the least absolute relative error models with a diverging number of covariates," Computational Statistics & Data Analysis, Elsevier, vol. 96(C), pages 104-119.
  • Handle: RePEc:eee:csdana:v:96:y:2016:i:c:p:104-119
    DOI: 10.1016/j.csda.2015.10.012

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947315002674
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2015.10.012?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a page where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Hansheng Wang & Bo Li & Chenlei Leng, 2009. "Shrinkage tuning parameter selection with a diverging number of parameters," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 71(3), pages 671-683, June.
    2. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    3. Harrison, David Jr. & Rubinfeld, Daniel L., 1978. "Hedonic housing prices and the demand for clean air," Journal of Environmental Economics and Management, Elsevier, vol. 5(1), pages 81-102, March.
    4. Chen, Kani & Guo, Shaojun & Lin, Yuanyuan & Ying, Zhiliang, 2010. "Least Absolute Relative Error Estimation," Journal of the American Statistical Association, American Statistical Association, vol. 105(491), pages 1104-1112.
    5. Wang, Hansheng & Li, Guodong & Jiang, Guohua, 2007. "Robust Regression Shrinkage and Consistent Variable Selection Through the LAD-Lasso," Journal of Business & Economic Statistics, American Statistical Association, vol. 25, pages 347-355, July.
    6. Zhouping Li & Yuanyuan Lin & Guoliang Zhou & Wang Zhou, 2014. "Empirical likelihood for least absolute relative error regression," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 23(1), pages 86-99, March.
    7. Jianqing Fan & Yunbei Ma & Wei Dai, 2014. "Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Varying Coefficient Models," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 109(507), pages 1270-1284, September.
    8. Zhezhen Jin, 2003. "Rank-based inference for the accelerated failure time model," Biometrika, Biometrika Trust, vol. 90(2), pages 341-353, June.
    9. Jian Huang & Shuangge Ma & Huiliang Xie, 2006. "Regularized Estimation in the Accelerated Failure Time Model with High-Dimensional Covariates," Biometrics, The International Biometric Society, vol. 62(3), pages 813-820, September.
    10. Wang, Hansheng & Leng, Chenlei, 2007. "Unified LASSO Estimation by Least Squares Approximation," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 1039-1048, September.
    11. Fan, Jianqing & Li, Runze, 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    12. Park, Heungsun & Stefanski, L. A., 1998. "Relative-error prediction," Statistics & Probability Letters, Elsevier, vol. 40(3), pages 227-236, October.
    13. Lan Wang & Yichao Wu & Runze Li, 2012. "Quantile Regression for Analyzing Heterogeneity in Ultra-High Dimension," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(497), pages 214-222, March.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Tianzhen Wang & Haixiang Zhang, 2022. "Optimal subsampling for multiplicative regression with massive data," Statistica Neerlandica, Netherlands Society for Statistics and Operations Research, vol. 76(4), pages 418-449, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Hao, Meiling & Lin, Yuanyuan & Zhao, Xingqiu, 2016. "A relative error-based approach for variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 103(C), pages 250-262.
    2. Fan, Rui & Lee, Ji Hyung & Shin, Youngki, 2023. "Predictive quantile regression with mixed roots and increasing dimensions: The ALQR approach," Journal of Econometrics, Elsevier, vol. 237(2).
    3. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
    4. T. Cai & J. Huang & L. Tian, 2009. "Regularized Estimation for the Accelerated Failure Time Model," Biometrics, The International Biometric Society, vol. 65(2), pages 394-404, June.
    5. Zhang, Shucong & Zhou, Yong, 2018. "Variable screening for ultrahigh dimensional heterogeneous data via conditional quantile correlations," Journal of Multivariate Analysis, Elsevier, vol. 165(C), pages 1-13.
    6. Lenka Zbonakova & Wolfgang Karl Härdle & Weining Wang, 2016. "Time Varying Quantile Lasso," SFB 649 Discussion Papers SFB649DP2016-047, Sonderforschungsbereich 649, Humboldt University, Berlin, Germany.
    7. Caner, Mehmet & Fan, Qingliang, 2015. "Hybrid generalized empirical likelihood estimators: Instrument selection with adaptive lasso," Journal of Econometrics, Elsevier, vol. 187(1), pages 256-274.
    8. Fei Jin & Lung-fei Lee, 2018. "Lasso Maximum Likelihood Estimation of Parametric Models with Singular Information Matrices," Econometrics, MDPI, vol. 6(1), pages 1-24, February.
    9. Guang Cheng & Hao Zhang & Zuofeng Shang, 2015. "Sparse and efficient estimation for partial spline models with increasing dimension," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 67(1), pages 93-127, February.
    10. Jin, Fei & Lee, Lung-fei, 2018. "Irregular N2SLS and LASSO estimation of the matrix exponential spatial specification model," Journal of Econometrics, Elsevier, vol. 206(2), pages 336-358.
    11. Abdul Wahid & Dost Muhammad Khan & Ijaz Hussain, 2017. "Robust Adaptive Lasso method for parameter’s estimation and variable selection in high-dimensional sparse models," PLOS ONE, Public Library of Science, vol. 12(8), pages 1-17, August.
    12. Song, Yunquan & Liang, Xijun & Zhu, Yanji & Lin, Lu, 2021. "Robust variable selection with exponential squared loss for the spatial autoregressive model," Computational Statistics & Data Analysis, Elsevier, vol. 155(C).
    13. Kean Ming Tan & Lan Wang & Wen‐Xin Zhou, 2022. "High‐dimensional quantile regression: Convolution smoothing and concave regularization," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(1), pages 205-233, February.
    14. Arslan, Olcay, 2012. "Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression," Computational Statistics & Data Analysis, Elsevier, vol. 56(6), pages 1952-1965.
    15. Guo-Liang Tian & Mingqiu Wang & Lixin Song, 2014. "Variable selection in the high-dimensional continuous generalized linear model with current status data," Journal of Applied Statistics, Taylor & Francis Journals, vol. 41(3), pages 467-483, March.
    16. Guo, Chaohui & Lv, Jing & Wu, Jibo, 2021. "Composite quantile regression for ultra-high dimensional semiparametric model averaging," Computational Statistics & Data Analysis, Elsevier, vol. 160(C).
    17. Quynh Van Nong & Chi Tim Ng, 2021. "Clustering of subsample means based on pairwise L1 regularized empirical likelihood," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 73(1), pages 135-174, February.
    18. Fan, Zengyan & Lian, Heng, 2018. "Quantile regression for additive coefficient models in high dimensions," Journal of Multivariate Analysis, Elsevier, vol. 164(C), pages 54-64.
    19. Lina Liao & Cheolwoo Park & Hosik Choi, 2019. "Penalized expectile regression: an alternative to penalized quantile regression," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 71(2), pages 409-438, April.
    20. Ping Zeng & Yongyue Wei & Yang Zhao & Jin Liu & Liya Liu & Ruyang Zhang & Jianwei Gou & Shuiping Huang & Feng Chen, 2014. "Variable selection approach for zero-inflated count data via adaptive lasso," Journal of Applied Statistics, Taylor & Francis Journals, vol. 41(4), pages 879-894, April.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:96:y:2016:i:c:p:104-119. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.