
Screening active factors in supersaturated designs

Author

Listed:
  • Das, Ujjwal
  • Gupta, Sudhir
  • Gupta, Shuva

Abstract

Identification of active factors in supersaturated designs (SSDs) has been the subject of much recent study. Although several methods have been proposed, the problem remains unsatisfactorily solved when more than one or two factors are active. The smoothly clipped absolute deviation (SCAD) penalty function for variable selection has attractive theoretical properties, but its nonconvexity poses computational difficulties in model fitting, and as a result it has so far shown little promise for SSDs. Another source of its inefficiency, particularly for SSDs, has been the method used to choose the SCAD sparsity tuning parameter. Selection of this tuning parameter using the AIC and BIC information criteria, generalized cross-validation, and a recently proposed method based on the norm of the error in the solution of systems of linear equations is investigated, in conjunction with a recently developed, more efficient algorithm for implementing the SCAD penalty. The small-sample bias-corrected cAIC is found to yield a model size closer to the true model size. Results of the numerical study and real data analyses show that SCAD is a valuable tool for identifying active factors in SSDs.
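
The abstract describes SCAD-penalized regression with the sparsity tuning parameter chosen by an information criterion. As a rough illustration of that idea (not the authors' algorithm, data, or code), the sketch below fits SCAD-penalized least squares to a simulated supersaturated design by plain coordinate descent with the usual SCAD thresholding rule (shape parameter a = 3.7, the default of Fan and Li, 2001), then selects the tuning parameter on a grid by BIC and by a small-sample corrected AIC (AICc). The simulated design, the lambda grid, the coordinate-descent solver, and the exact forms of the criteria are assumptions made for the sketch; the paper additionally considers generalized cross-validation and an error-norm-based selector, and uses a more efficient SCAD algorithm.

```python
# Minimal sketch (assumptions flagged in comments): SCAD-penalized least squares
# for screening active factors in a supersaturated design, with the tuning
# parameter chosen by BIC or small-sample corrected AIC. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Simulated supersaturated design (assumption): n runs, p > n two-level factors.
n, p = 14, 23
X = rng.choice([-1.0, 1.0], size=(n, p))
X = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize so x_j'x_j = n
beta_true = np.zeros(p)
beta_true[[2, 7, 15]] = [3.0, -2.0, 2.5]          # three "active" factors (assumed)
y = X @ beta_true + rng.normal(scale=1.0, size=n)
y = y - y.mean()

A = 3.7  # SCAD shape parameter a, the default suggested by Fan and Li (2001)

def scad_threshold(z, lam, a=A):
    """Minimizer of 0.5*(z - b)**2 + SCAD penalty p_lam(|b|) in one dimension."""
    az = abs(z)
    if az <= 2.0 * lam:                            # soft-thresholding region
        return np.sign(z) * max(az - lam, 0.0)
    if az <= a * lam:                              # tapered-penalty region
        return np.sign(z) * ((a - 1.0) * az - a * lam) / (a - 2.0)
    return z                                       # large coefficients unpenalized

def scad_cd(X, y, lam, max_iter=200, tol=1e-8):
    """Plain coordinate descent for SCAD-penalized least squares (illustrative)."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y.copy()                                   # residual y - X @ beta
    for _ in range(max_iter):
        max_change = 0.0
        for j in range(p):
            z = X[:, j] @ r / n + beta[j]          # univariate problem for factor j
            b_new = scad_threshold(z, lam)
            if b_new != beta[j]:
                r -= X[:, j] * (b_new - beta[j])
                max_change = max(max_change, abs(b_new - beta[j]))
                beta[j] = b_new
        if max_change < tol:
            break
    return beta

def bic_and_aicc(X, y, beta):
    """BIC and small-sample corrected AIC for a sparse fit (illustrative forms)."""
    n = len(y)
    k = int(np.count_nonzero(beta))
    if k >= n - 1:                                 # criteria break down for saturated fits
        return np.inf, np.inf
    rss = float(np.sum((y - X @ beta) ** 2))
    bic = n * np.log(rss / n) + k * np.log(n)
    aicc = n * np.log(rss / n) + 2.0 * k + 2.0 * k * (k + 1) / (n - k - 1)
    return bic, aicc

# Screen a grid of tuning parameters and keep the criterion-minimizing model.
lam_max = np.max(np.abs(X.T @ y)) / n
fits = [(lam, scad_cd(X, y, lam)) for lam in np.geomspace(0.05, 1.0, 30) * lam_max]
best_bic = min(fits, key=lambda f: bic_and_aicc(X, y, f[1])[0])
best_aicc = min(fits, key=lambda f: bic_and_aicc(X, y, f[1])[1])
print("factors flagged active (BIC): ", np.flatnonzero(best_bic[1]))
print("factors flagged active (AICc):", np.flatnonzero(best_aicc[1]))
```

Factors whose coefficients remain nonzero at the criterion-minimizing lambda are the ones flagged as active; the paper's point is that the choice of criterion matters for SSDs, with the corrected AIC tending to give model sizes closer to the truth.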

Suggested Citation

  • Das, Ujjwal & Gupta, Sudhir & Gupta, Shuva, 2014. "Screening active factors in supersaturated designs," Computational Statistics & Data Analysis, Elsevier, vol. 77(C), pages 223-232.
  • Handle: RePEc:eee:csdana:v:77:y:2014:i:c:p:223-232
    DOI: 10.1016/j.csda.2014.02.023

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947314000632
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2014.02.023?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Friedman, Jerome H. & Hastie, Trevor & Tibshirani, Rob, 2010. "Regularization Paths for Generalized Linear Models via Coordinate Descent," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 33(i01).
    2. Li, Runze & Lin, Dennis K. J., 2002. "Data analysis in supersaturated designs," Statistics & Probability Letters, Elsevier, vol. 59(2), pages 135-144, September.
    3. Kim, Yongdai & Choi, Hosik & Oh, Hee-Seok, 2008. "Smoothly Clipped Absolute Deviation on High Dimensions," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1665-1673.
    4. Fan, Jianqing & Li, Runze, 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    5. Marley, Christopher J. & Woods, David C., 2010. "A comparison of design and model selection methods for supersaturated experiments," Computational Statistics & Data Analysis, Elsevier, vol. 54(12), pages 3158-3167, December.
    6. Edwards, David J. & Mee, Robert W., 2011. "Supersaturated designs: Are our results significant?," Computational Statistics & Data Analysis, Elsevier, vol. 55(9), pages 2652-2664, September.
    7. Hansheng Wang & Runze Li & Chih-Ling Tsai, 2007. "Tuning parameter selectors for the smoothly clipped absolute deviation method," Biometrika, Biometrika Trust, vol. 94(3), pages 553-568.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yanxin Wang & Qibin Fan & Li Zhu, 2018. "Variable selection and estimation using a continuous approximation to the $$L_0$$ penalty," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 70(1), pages 191-214, February.
    2. Loann David Denis Desboulets, 2018. "A Review on Variable Selection in Regression Analysis," Econometrics, MDPI, vol. 6(4), pages 1-27, November.
    3. Wang, Tao & Xu, Pei-Rong & Zhu, Li-Xing, 2012. "Non-convex penalized estimation in high-dimensional models with single-index structure," Journal of Multivariate Analysis, Elsevier, vol. 109(C), pages 221-235.
    4. Joel L. Horowitz, 2015. "Variable selection and estimation in high-dimensional models," CeMMAP working papers 35/15, Institute for Fiscal Studies.
    5. Hirose, Kei & Tateishi, Shohei & Konishi, Sadanori, 2013. "Tuning parameter selection in sparse regression modeling," Computational Statistics & Data Analysis, Elsevier, vol. 59(C), pages 28-40.
    6. Wu, Tong Tong & He, Xin, 2012. "Coordinate ascent for penalized semiparametric regression on high-dimensional panel count data," Computational Statistics & Data Analysis, Elsevier, vol. 56(1), pages 25-33, January.
    7. D.M. Sakate & D.N. Kashid, 2014. "Variable selection via penalized minimum φ-divergence estimation in logistic regression," Journal of Applied Statistics, Taylor & Francis Journals, vol. 41(6), pages 1233-1246, June.
    8. Gaorong Li & Liugen Xue & Heng Lian, 2012. "SCAD-penalised generalised additive models with non-polynomial dimensionality," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 24(3), pages 681-697.
    9. Yunxiao Chen & Xiaoou Li & Jingchen Liu & Zhiliang Ying, 2017. "Regularized Latent Class Analysis with Application in Cognitive Diagnosis," Psychometrika, Springer;The Psychometric Society, vol. 82(3), pages 660-692, September.
    10. Li, Xinjue & Zboňáková, Lenka & Wang, Weining & Härdle, Wolfgang Karl, 2019. "Combining Penalization and Adaption in High Dimension with Application in Bond Risk Premia Forecasting," IRTG 1792 Discussion Papers 2019-030, Humboldt University of Berlin, International Research Training Group 1792 "High Dimensional Nonstationary Time Series".
    11. Zhixuan Fu & Shuangge Ma & Haiqun Lin & Chirag R. Parikh & Bingqing Zhou, 2017. "Penalized Variable Selection for Multi-center Competing Risks Data," Statistics in Biosciences, Springer;International Chinese Statistical Association, vol. 9(2), pages 379-405, December.
    12. Wei Qian & Yuhong Yang, 2013. "Model selection via standard error adjusted adaptive lasso," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 65(2), pages 295-318, April.
    13. Kwon, Sunghoon & Choi, Hosik & Kim, Yongdai, 2011. "Quadratic approximation on SCAD penalized estimation," Computational Statistics & Data Analysis, Elsevier, vol. 55(1), pages 421-428, January.
    14. Wei Wang & Shou‐En Lu & Jerry Q. Cheng & Minge Xie & John B. Kostis, 2022. "Multivariate survival analysis in big data: A divide‐and‐combine approach," Biometrics, The International Biometric Society, vol. 78(3), pages 852-866, September.
    15. Joel L. Horowitz, 2015. "Variable selection and estimation in high‐dimensional models," Canadian Journal of Economics/Revue canadienne d'économique, John Wiley & Sons, vol. 48(2), pages 389-407, May.
    16. Matsui, Hidetoshi, 2014. "Variable and boundary selection for functional data via multiclass logistic regression modeling," Computational Statistics & Data Analysis, Elsevier, vol. 78(C), pages 176-185.
    17. Lian, Heng & Li, Jianbo & Tang, Xingyu, 2014. "SCAD-penalized regression in additive partially linear proportional hazards models with an ultra-high-dimensional linear part," Journal of Multivariate Analysis, Elsevier, vol. 125(C), pages 50-64.
    18. Lee, Sangin & Kwon, Sunghoon & Kim, Yongdai, 2016. "A modified local quadratic approximation algorithm for penalized optimization problems," Computational Statistics & Data Analysis, Elsevier, vol. 94(C), pages 275-286.
    19. Chen, Yunxiao & Li, Xiaoou & Liu, Jingchen & Ying, Zhiliang, 2017. "Regularized latent class analysis with application in cognitive diagnosis," LSE Research Online Documents on Economics 103182, London School of Economics and Political Science, LSE Library.
    20. Joel L. Horowitz, 2015. "Variable selection and estimation in high-dimensional models," Canadian Journal of Economics, Canadian Economics Association, vol. 48(2), pages 389-407, May.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:77:y:2014:i:c:p:223-232. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.