
A relative error-based approach for variable selection

Author

Listed:
  • Hao, Meiling
  • Lin, Yuanyuan
  • Zhao, Xingqiu

Abstract

The accelerated failure time model, also known as the multiplicative regression model, is well suited to analyzing data with positive responses. For the multiplicative regression model, the authors investigate an adaptive variable selection method that combines a relative error-based criterion with a Lasso-type penalty, offering both desirable theoretical properties and computational convenience. With a fixed or diverging number of variables in the regression model, the resulting estimator achieves the oracle property. An alternating direction method of multipliers (ADMM) algorithm is proposed for computing the regularization paths efficiently, and a data-driven procedure based on the Bayesian information criterion is used to choose the tuning parameter. The finite-sample performance of the proposed method is examined via simulation studies, and an application is illustrated with an analysis of one period of stock returns on the Hong Kong Stock Exchange.
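The page does not reproduce the paper's formulas, but as a rough, hypothetical illustration the sketch below fits an adaptive-Lasso-penalized least absolute relative error (LARE) objective of the kind studied by Chen, Guo, Lin and Ying (2010; reference 4 below) for the multiplicative model Y = exp(X'beta) * epsilon. It uses a generic optimizer from SciPy rather than the authors' ADMM algorithm and omits the BIC-based choice of the tuning parameter; all function names and the simulated data are invented for the example.

    import numpy as np
    from scipy.optimize import minimize

    def lare_loss(beta, X, y):
        # LARE criterion (Chen et al., 2010): |y - yhat| / y + |y - yhat| / yhat,
        # summed over observations, with yhat = exp(X'beta).
        yhat = np.exp(X @ beta)
        resid = np.abs(y - yhat)
        return np.sum(resid / y + resid / yhat)

    def adaptive_lasso_lare(X, y, lam):
        # Step 1: an unpenalized pilot fit supplies adaptive weights 1 / |beta_tilde|.
        p = X.shape[1]
        opts = {"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-6}
        pilot = minimize(lare_loss, np.zeros(p), args=(X, y),
                         method="Nelder-Mead", options=opts).x
        w = 1.0 / (np.abs(pilot) + 1e-8)
        # Step 2: minimize the LARE loss plus the weighted L1 (Lasso-type) penalty.
        obj = lambda b: lare_loss(b, X, y) + lam * np.sum(w * np.abs(b))
        return minimize(obj, pilot, method="Nelder-Mead", options=opts).x

    # Toy check on simulated data: only the first two covariates are active.
    rng = np.random.default_rng(0)
    n, p = 200, 5
    X = rng.normal(size=(n, p))
    beta_true = np.array([1.0, -0.5, 0.0, 0.0, 0.0])
    y = np.exp(X @ beta_true) * np.exp(rng.normal(scale=0.3, size=n))
    print(np.round(adaptive_lasso_lare(X, y, lam=5.0), 3))

Because a smooth general-purpose optimizer only shrinks coefficients toward zero, the ADMM algorithm described in the abstract is what makes exact zeros and efficient computation of the whole regularization path practical; the BIC step would then select lam from a grid of candidate values.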

Suggested Citation

  • Hao, Meiling & Lin, Yuanyuan & Zhao, Xingqiu, 2016. "A relative error-based approach for variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 103(C), pages 250-262.
  • Handle: RePEc:eee:csdana:v:103:y:2016:i:c:p:250-262
    DOI: 10.1016/j.csda.2016.05.013

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947316301153
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2016.05.013?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Hansheng Wang & Bo Li & Chenlei Leng, 2009. "Shrinkage tuning parameter selection with a diverging number of parameters," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 71(3), pages 671-683, June.
    2. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    3. Gneiting, Tilmann, 2011. "Making and Evaluating Point Forecasts," Journal of the American Statistical Association, American Statistical Association, vol. 106(494), pages 746-762.
    4. Chen, Kani & Guo, Shaojun & Lin, Yuanyuan & Ying, Zhiliang, 2010. "Least Absolute Relative Error Estimation," Journal of the American Statistical Association, American Statistical Association, vol. 105(491), pages 1104-1112.
    5. Wang, Hansheng & Li, Guodong & Jiang, Guohua, 2007. "Robust Regression Shrinkage and Consistent Variable Selection Through the LAD-Lasso," Journal of Business & Economic Statistics, American Statistical Association, vol. 25, pages 347-355, July.
    6. Hao Helen Zhang & Wenbin Lu, 2007. "Adaptive Lasso for Cox's proportional hazards model," Biometrika, Biometrika Trust, vol. 94(3), pages 691-703.
    7. Wu, Yujun & Boos, Dennis D. & Stefanski, Leonard A., 2007. "Controlling Variable Selection by the Addition of Pseudovariables," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 235-243, March.
    8. Chen, Kani & Lin, Yuanyuan & Wang, Zhanfeng & Ying, Zhiliang, 2016. "Least product relative error estimation," Journal of Multivariate Analysis, Elsevier, vol. 144(C), pages 91-98.
    9. Stephan Kolassa & Roland Martin, 2011. "Percentage Errors Can Ruin Your Day (and Rolling the Dice Shows How)," Foresight: The International Journal of Applied Forecasting, International Institute of Forecasters, issue 23, pages 21-27, Fall.
    10. Jinfeng Xu & Zhiliang Ying, 2010. "Simultaneous estimation and variable selection in median regression using Lasso-type penalty," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 62(3), pages 487-514, June.
    11. Shen X. & Ye J., 2002. "Adaptive Model Selection," Journal of the American Statistical Association, American Statistical Association, vol. 97, pages 210-221, March.
    12. Wang, Hansheng & Leng, Chenlei, 2007. "Unified LASSO Estimation by Least Squares Approximation," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 1039-1048, September.
    13. Fan J. & Li R., 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    14. Park, Heungsun & Stefanski, L. A., 1998. "Relative-error prediction," Statistics & Probability Letters, Elsevier, vol. 40(3), pages 227-236, October.
    15. Makridakis, Spyros, 1993. "Accuracy measures: theoretical and practical concerns," International Journal of Forecasting, Elsevier, vol. 9(4), pages 527-529, December.
    16. Hansheng Wang & Runze Li & Chih-Ling Tsai, 2007. "Tuning parameter selectors for the smoothly clipped absolute deviation method," Biometrika, Biometrika Trust, vol. 94(3), pages 553-568.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Yinjun Chen & Hao Ming & Hu Yang, 2024. "Efficient variable selection for high-dimensional multiplicative models: a novel LPRE-based approach," Statistical Papers, Springer, vol. 65(6), pages 3713-3737, August.
    2. Huilan Liu & Xiawei Zhang & Huaiqing Hu & Junjie Ma, 2024. "Analysis of the positive response data with the varying coefficient partially nonlinear multiplicative model," Statistical Papers, Springer, vol. 65(5), pages 3063-3092, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Xia, Xiaochao & Liu, Zhi & Yang, Hu, 2016. "Regularized estimation for the least absolute relative error models with a diverging number of covariates," Computational Statistics & Data Analysis, Elsevier, vol. 96(C), pages 104-119.
    2. Guang Cheng & Hao Zhang & Zuofeng Shang, 2015. "Sparse and efficient estimation for partial spline models with increasing dimension," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 67(1), pages 93-127, February.
    3. Fan, Rui & Lee, Ji Hyung & Shin, Youngki, 2023. "Predictive quantile regression with mixed roots and increasing dimensions: The ALQR approach," Journal of Econometrics, Elsevier, vol. 237(2).
    4. Fei Jin & Lung-fei Lee, 2018. "Lasso Maximum Likelihood Estimation of Parametric Models with Singular Information Matrices," Econometrics, MDPI, vol. 6(1), pages 1-24, February.
    5. Jin, Fei & Lee, Lung-fei, 2018. "Irregular N2SLS and LASSO estimation of the matrix exponential spatial specification model," Journal of Econometrics, Elsevier, vol. 206(2), pages 336-358.
    6. Ping Zeng & Yongyue Wei & Yang Zhao & Jin Liu & Liya Liu & Ruyang Zhang & Jianwei Gou & Shuiping Huang & Feng Chen, 2014. "Variable selection approach for zero-inflated count data via adaptive lasso," Journal of Applied Statistics, Taylor & Francis Journals, vol. 41(4), pages 879-894, April.
    7. Bang, Sungwan & Jhun, Myoungshic, 2012. "Simultaneous estimation and factor selection in quantile regression via adaptive sup-norm regularization," Computational Statistics & Data Analysis, Elsevier, vol. 56(4), pages 813-826.
    8. Wei Wang & Shou‐En Lu & Jerry Q. Cheng & Minge Xie & John B. Kostis, 2022. "Multivariate survival analysis in big data: A divide‐and‐combine approach," Biometrics, The International Biometric Society, vol. 78(3), pages 852-866, September.
    9. Hirose, Kei & Tateishi, Shohei & Konishi, Sadanori, 2013. "Tuning parameter selection in sparse regression modeling," Computational Statistics & Data Analysis, Elsevier, vol. 59(C), pages 28-40.
    10. Lee, Eun Ryung & Park, Byeong U., 2012. "Sparse estimation in functional linear regression," Journal of Multivariate Analysis, Elsevier, vol. 105(1), pages 1-17.
    11. Lian, Heng & Li, Jianbo & Tang, Xingyu, 2014. "SCAD-penalized regression in additive partially linear proportional hazards models with an ultra-high-dimensional linear part," Journal of Multivariate Analysis, Elsevier, vol. 125(C), pages 50-64.
    12. Arslan, Olcay, 2012. "Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression," Computational Statistics & Data Analysis, Elsevier, vol. 56(6), pages 1952-1965.
    13. Yanxin Wang & Qibin Fan & Li Zhu, 2018. "Variable selection and estimation using a continuous approximation to the $$L_0$$ penalty," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 70(1), pages 191-214, February.
    14. Hu, Yuao & Lian, Heng, 2013. "Variable selection in a partially linear proportional hazards model with a diverging dimensionality," Statistics & Probability Letters, Elsevier, vol. 83(1), pages 61-69.
    15. Pötscher, Benedikt M. & Schneider, Ulrike, 2007. "On the distribution of the adaptive LASSO estimator," MPRA Paper 6913, University Library of Munich, Germany.
    16. Li, Jianbo & Gu, Minggao, 2012. "Adaptive LASSO for general transformation models with right censored data," Computational Statistics & Data Analysis, Elsevier, vol. 56(8), pages 2583-2597.
    17. Pötscher, Benedikt M., 2007. "Confidence Sets Based on Sparse Estimators Are Necessarily Large," MPRA Paper 5677, University Library of Munich, Germany.
    18. Tang, Linjun & Zhou, Zhangong & Wu, Changchun, 2012. "Weighted composite quantile estimation and variable selection method for censored regression model," Statistics & Probability Letters, Elsevier, vol. 82(3), pages 653-663.
    19. Gaorong Li & Liugen Xue & Heng Lian, 2012. "SCAD-penalised generalised additive models with non-polynomial dimensionality," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 24(3), pages 681-697.
    20. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:103:y:2016:i:c:p:250-262. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.