Printed from https://ideas.repec.org/a/eee/csdana/v198y2024ics016794732400077x.html

Consistent skinny Gibbs in probit regression

Author

Listed:
  • Ouyang, Jiarong
  • Cao, Xuan

Abstract

Spike-and-slab priors have emerged as effective and computationally scalable tools for Bayesian variable selection in high-dimensional linear regression. However, model selection consistency and efficient computational strategies with spike-and-slab priors in probit regression have rarely been investigated. A hierarchical probit model with continuous spike-and-slab priors on the regression coefficients is considered, and a highly scalable Gibbs sampler whose computational complexity grows only linearly in the number of predictors is proposed. Specifically, the "Skinny Gibbs" algorithm is adapted to the probit and negative binomial regression settings, and model selection consistency of the proposed method under the probit model is established when the number of covariates is allowed to grow much larger than the sample size. Simulation studies show that the method achieves superior empirical performance compared with other state-of-the-art methods. Gene expression data from 51 asthmatic and 44 non-asthmatic samples are analyzed, and the proposed approach's performance in predicting asthma is compared with that of existing approaches.
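To make the setup in the abstract concrete, the sketch below implements the *standard* (non-skinny) Gibbs sampler it builds on: Albert-Chib latent-variable augmentation for the probit link combined with continuous spike-and-slab selection indicators. This is an illustration, not the paper's algorithm: its per-iteration cost is cubic in the number of predictors because of the full p-by-p matrix inversion, which is precisely the bottleneck the Skinny Gibbs approximation removes. All hyperparameter values (`tau0`, `tau1`, `q`) are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.special import expit
from scipy.stats import norm, truncnorm

def ssvs_probit_gibbs(X, y, n_iter=500, burn=250,
                      tau0=0.05, tau1=2.0, q=0.1, seed=0):
    """Gibbs sampler for probit regression with continuous spike-and-slab priors.

    Prior: beta_j | gamma_j ~ N(0, tau1^2) if gamma_j = 1 (slab),
    N(0, tau0^2) otherwise (spike); gamma_j ~ Bernoulli(q).
    Returns the posterior inclusion probability of each predictor.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.zeros(p)
    gamma = np.zeros(p, dtype=bool)
    incl = np.zeros(p)
    XtX = X.T @ X
    for it in range(n_iter):
        # 1. Latent utilities z_i ~ N(x_i' beta, 1), truncated to agree with y_i
        #    (Albert-Chib data augmentation for the probit link).
        mu = X @ beta
        lo = np.where(y == 1, -mu, -np.inf)  # standardized lower bounds
        hi = np.where(y == 1, np.inf, -mu)   # standardized upper bounds
        z = truncnorm.rvs(lo, hi, loc=mu, scale=1.0, random_state=rng)
        # 2. beta | z, gamma: Gaussian with precision X'X + D^{-1}, where D is
        #    diagonal with tau1^2 (slab) or tau0^2 (spike) entries. The p x p
        #    inversion here is the step Skinny Gibbs avoids.
        d_inv = np.where(gamma, tau1, tau0) ** -2.0
        cov = np.linalg.inv(XtX + np.diag(d_inv))
        beta = rng.multivariate_normal(cov @ (X.T @ z), cov)
        # 3. gamma_j | beta_j: Bernoulli, comparing the slab and spike
        #    densities evaluated at the current beta_j.
        log_odds = (np.log(q / (1.0 - q))
                    + norm.logpdf(beta, 0.0, tau1)
                    - norm.logpdf(beta, 0.0, tau0))
        gamma = rng.random(p) < expit(log_odds)
        if it >= burn:
            incl += gamma
    return incl / (n_iter - burn)
```

On simulated data with a few strong signals, the returned inclusion probabilities separate active from inactive predictors; the paper's contribution is retaining this behavior, with proven consistency, at a cost linear rather than cubic in the dimension.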

Suggested Citation

  • Ouyang, Jiarong & Cao, Xuan, 2024. "Consistent skinny Gibbs in probit regression," Computational Statistics & Data Analysis, Elsevier, vol. 198(C).
  • Handle: RePEc:eee:csdana:v:198:y:2024:i:c:s016794732400077x
    DOI: 10.1016/j.csda.2024.107993

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S016794732400077X
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2024.107993?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Faming Liang & Qifan Song & Kai Yu, 2013. "Bayesian Subset Modeling for High-Dimensional Generalized Linear Models," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 108(502), pages 589-606, June.
    2. Kshitij Khare & Sang-Yun Oh & Bala Rajaratnam, 2015. "A convex pseudolikelihood framework for high dimensional partial correlation estimation with convergence guarantees," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 77(4), pages 803-825, September.
    3. Fan J. & Li R., 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    4. Veronika Ročková & Edward I. George, 2018. "The Spike-and-Slab LASSO," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(521), pages 431-444, January.
    5. Valen E. Johnson & David Rossell, 2012. "Bayesian Model Selection in High-Dimensional Settings," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(498), pages 649-660, June.
    6. Nicholas G. Polson & James G. Scott & Jesse Windle, 2013. "Bayesian Inference for Logistic Models Using Pólya-Gamma Latent Variables," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 108(504), pages 1339-1349, December.
    7. Naveen N. Narisetty & Juan Shen & Xuming He, 2019. "Skinny Gibbs: A Consistent and Scalable Gibbs Sampler for Model Selection," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 114(527), pages 1205-1217, July.
    8. Kyoungjae Lee & Xuan Cao, 2021. "Bayesian group selection in logistic regression with application to MRI data analysis," Biometrics, The International Biometric Society, vol. 77(2), pages 391-400, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Wang, Jia & Cai, Xizhen & Li, Runze, 2021. "Variable selection for partially linear models via Bayesian subset modeling with diffusing prior," Journal of Multivariate Analysis, Elsevier, vol. 183(C).
    2. Qifan Song & Guang Cheng, 2020. "Bayesian Fusion Estimation via t Shrinkage," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 82(2), pages 353-385, August.
    3. Li, Cheng & Jiang, Wenxin, 2016. "On oracle property and asymptotic validity of Bayesian generalized method of moments," Journal of Multivariate Analysis, Elsevier, vol. 145(C), pages 132-147.
    4. Shi, Guiling & Lim, Chae Young & Maiti, Tapabrata, 2019. "Bayesian model selection for generalized linear models using non-local priors," Computational Statistics & Data Analysis, Elsevier, vol. 133(C), pages 285-296.
    5. Xueying Tang & Xiaofan Xu & Malay Ghosh & Prasenjit Ghosh, 2018. "Bayesian Variable Selection and Estimation Based on Global-Local Shrinkage Priors," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 80(2), pages 215-246, August.
    6. Qifan Song & Faming Liang, 2015. "High-Dimensional Variable Selection With Reciprocal L1-Regularization," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(512), pages 1607-1620, December.
    7. Sierra A. Bainter & Thomas G. McCauley & Mahmoud M. Fahmy & Zachary T. Goodman & Lauren B. Kupis & J. Sunil Rao, 2023. "Comparing Bayesian Variable Selection to Lasso Approaches for Applications in Psychology," Psychometrika, Springer;The Psychometric Society, vol. 88(3), pages 1032-1055, September.
    8. Byrd, Michael & Nghiem, Linh H. & McGee, Monnie, 2021. "Bayesian regularization of Gaussian graphical models with measurement error," Computational Statistics & Data Analysis, Elsevier, vol. 156(C).
    9. Gonzalo García-Donato & María Eugenia Castellanos & Alicia Quirós, 2021. "Bayesian Variable Selection with Applications in Health Sciences," Mathematics, MDPI, vol. 9(3), pages 1-16, January.
    10. Minerva Mukhopadhyay & David B. Dunson, 2020. "Targeted Random Projection for Prediction From High-Dimensional Features," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 115(532), pages 1998-2010, December.
    11. Hu, Jianhua & Liu, Xiaoqian & Liu, Xu & Xia, Ningning, 2022. "Some aspects of response variable selection and estimation in multivariate linear regression," Journal of Multivariate Analysis, Elsevier, vol. 188(C).
    12. Posch, Konstantin & Arbeiter, Maximilian & Pilz, Juergen, 2020. "A novel Bayesian approach for variable selection in linear regression models," Computational Statistics & Data Analysis, Elsevier, vol. 144(C).
    13. Tanin Sirimongkolkasem & Reza Drikvandi, 2019. "On Regularisation Methods for Analysis of High Dimensional Data," Annals of Data Science, Springer, vol. 6(4), pages 737-763, December.
    14. Mogliani, Matteo & Simoni, Anna, 2021. "Bayesian MIDAS penalized regressions: Estimation, selection, and prediction," Journal of Econometrics, Elsevier, vol. 222(1), pages 833-860.
    15. Kshitij Khare & Malay Ghosh, 2022. "MCMC Convergence for Global-Local Shrinkage Priors," Journal of Quantitative Economics, Springer;The Indian Econometric Society (TIES), vol. 20(1), pages 211-234, September.
    16. Byron Botha & Rulof Burger & Kevin Kotzé & Neil Rankin & Daan Steenkamp, 2023. "Big data forecasting of South African inflation," Empirical Economics, Springer, vol. 65(1), pages 149-188, July.
    17. Banerjee, Sayantan, 2022. "Horseshoe shrinkage methods for Bayesian fusion estimation," Computational Statistics & Data Analysis, Elsevier, vol. 174(C).
    18. Shiqiang Jin & Gyuhyeong Goh, 2021. "Bayesian selection of best subsets via hybrid search," Computational Statistics, Springer, vol. 36(3), pages 1991-2007, September.
    19. Jingying Yang & Guishu Bai & Mei Yan, 2023. "Minimum Residual Sum of Squares Estimation Method for High-Dimensional Partial Correlation Coefficient," Mathematics, MDPI, vol. 11(20), pages 1-22, October.
    20. Zhang, Chun-Xia & Xu, Shuang & Zhang, Jiang-She, 2019. "A novel variational Bayesian method for variable selection in logistic regression models," Computational Statistics & Data Analysis, Elsevier, vol. 133(C), pages 1-19.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:198:y:2024:i:c:s016794732400077x. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.