Printed from https://ideas.repec.org/a/eee/csdana/v192y2024ics0167947323002104.html

A fast trans-lasso algorithm with penalized weighted score function

Author

Listed:
  • Fan, Xianqiu
  • Cheng, Jun
  • Wang, Hailing
  • Zhang, Bin
  • Chen, Zhenzhen

Abstract

An efficient transfer learning algorithm for high-dimensional sparse logistic regression models is proposed using a penalized weighted score function based on the square-root Lasso, which allows the tuning parameter to be prespecified. Three choices of the tuning parameter are considered in the case of a fixed design matrix. With a novel weight construction, the estimator of the regression vector is shown to be consistent, provided that the inequality associated with the Karush-Kuhn-Tucker (KKT) optimality conditions holds with high probability and the regression vectors satisfy a sparsity assumption. When the source data carry information useful for fitting the target data, the KKT optimality conditions hold for the asymptotic choice with probability tending to 1-α, a bound sharper than the corresponding probability bound obtained without the auxiliary samples. To detect which sources are transferable, an efficient data-driven method is proposed, which helps avoid negative transfer in practice. Simulation studies demonstrate the numerical performance of the proposed procedures and their superiority over some existing methods. The procedures are also illustrated by analyzing the China Migrants Dynamic Survey dataset, with binary outcomes concerning the associations among different provinces.

Suggested Citation

  • Fan, Xianqiu & Cheng, Jun & Wang, Hailing & Zhang, Bin & Chen, Zhenzhen, 2024. "A fast trans-lasso algorithm with penalized weighted score function," Computational Statistics & Data Analysis, Elsevier, vol. 192(C).
  • Handle: RePEc:eee:csdana:v:192:y:2024:i:c:s0167947323002104
    DOI: 10.1016/j.csda.2023.107899

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947323002104
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2023.107899?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. A. Belloni & V. Chernozhukov & L. Wang, 2011. "Square-root lasso: pivotal recovery of sparse signals via conic programming," Biometrika, Biometrika Trust, vol. 98(4), pages 791-806.
    2. Lukas Meier & Sara Van De Geer & Peter Bühlmann, 2008. "The group lasso for logistic regression," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 70(1), pages 53-71, February.
    3. Tingni Sun & Cun-Hui Zhang, 2012. "Scaled sparse linear regression," Biometrika, Biometrika Trust, vol. 99(4), pages 879-898.
    4. Yuan Jiang & Yunxiao He & Heping Zhang, 2016. "Variable Selection With Prior Information for Generalized Linear Models via the Prior LASSO Method," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(513), pages 355-376, March.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
1. Zemin Zheng & Jie Zhang & Yang Li, 2022. "L0-Regularized Learning for High-Dimensional Additive Hazards Regression," INFORMS Journal on Computing, INFORMS, vol. 34(5), pages 2762-2775, September.
    2. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
    3. Victor Chernozhukov & Christian Hansen & Yuan Liao, 2015. "A lava attack on the recovery of sums of dense and sparse signals," CeMMAP working papers CWP56/15, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    4. Zanhua Yin, 2020. "Variable selection for sparse logistic regression," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 83(7), pages 821-836, October.
    5. Alexandre Belloni & Victor Chernozhukov & Lie Wang, 2013. "Pivotal estimation via square-root lasso in nonparametric regression," CeMMAP working papers CWP62/13, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    6. Anindya Bhadra & Jyotishka Datta & Nicholas G. Polson & Brandon T. Willard, 2020. "Global-Local Mixtures: A Unifying Framework," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 82(2), pages 426-447, August.
    7. Pei Wang & Shunjie Chen & Sijia Yang, 2022. "Recent Advances on Penalized Regression Models for Biological Data," Mathematics, MDPI, vol. 10(19), pages 1-24, October.
    8. Beyhum, Jad, 2019. "Inference robust to outliers with L1‐norm penalization," TSE Working Papers 19-1032, Toulouse School of Economics (TSE).
    9. Zemin Zheng & Jinchi Lv & Wei Lin, 2021. "Nonsparse Learning with Latent Variables," Operations Research, INFORMS, vol. 69(1), pages 346-359, January.
    10. Guo, Zijian & Kang, Hyunseung & Cai, T. Tony & Small, Dylan S., 2018. "Testing endogeneity with high dimensional covariates," Journal of Econometrics, Elsevier, vol. 207(1), pages 175-187.
    11. Alexis Derumigny, 2017. "Improved bounds for Square-Root Lasso and Square-Root Slope," Working Papers 2017-53, Center for Research in Economics and Statistics.
    12. Jad Beyhum, 2020. "Inference robust to outliers with L1‐norm penalization," Post-Print hal-03235868, HAL.
    13. Zhang Haixiang & Zheng Yinan & Yoon Grace & Zhang Zhou & Gao Tao & Joyce Brian & Zhang Wei & Schwartz Joel & Vokonas Pantel & Colicino Elena & Baccarelli Andrea & Hou Lifang & Liu Lei, 2017. "Regularized estimation in sparse high-dimensional multivariate regression, with application to a DNA methylation study," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 16(3), pages 159-171, August.
    14. Sermpinis, Georgios & Tsoukas, Serafeim & Zhang, Ping, 2018. "Modelling market implied ratings using LASSO variable selection techniques," Journal of Empirical Finance, Elsevier, vol. 48(C), pages 19-35.
    15. Mohamed Ouhourane & Yi Yang & Andréa L. Benedet & Karim Oualkacha, 2022. "Group penalized quantile regression," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 31(3), pages 495-529, September.
    16. Xie, Jichun & Kang, Jian, 2017. "High-dimensional tests for functional networks of brain anatomic regions," Journal of Multivariate Analysis, Elsevier, vol. 156(C), pages 70-88.
    17. Mingrui Zhong & Zanhua Yin & Zhichao Wang, 2023. "Variable Selection for Sparse Logistic Regression with Grouped Variables," Mathematics, MDPI, vol. 11(24), pages 1-21, December.
    18. Wanling Xie & Hu Yang, 2023. "Group sparse recovery via group square-root elastic net and the iterative multivariate thresholding-based algorithm," AStA Advances in Statistical Analysis, Springer;German Statistical Society, vol. 107(3), pages 469-507, September.
    19. Chen, Shunjie & Yang, Sijia & Wang, Pei & Xue, Liugen, 2023. "Two-stage penalized algorithms via integrating prior information improve gene selection from omics data," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 628(C).
    20. Pun, Chi Seng & Hadimaja, Matthew Zakharia, 2021. "A self-calibrated direct approach to precision matrix estimation and linear discriminant analysis in high dimensions," Computational Statistics & Data Analysis, Elsevier, vol. 155(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:192:y:2024:i:c:s0167947323002104. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.