
Multi-block alternating direction method of multipliers for ultrahigh dimensional quantile fused regression

Author

Listed:
  • Wu, Xiaofei
  • Ming, Hao
  • Zhang, Zhimin
  • Cui, Zhenyu

Abstract

In this paper, we consider a quantile fused LASSO regression model that combines the quantile regression loss with the fused LASSO penalty. Intuitively, this model offers robustness to outliers, thanks to the quantile regression loss, while also effectively recovering sparse and block-structured coefficients through the fused LASSO penalty. To adapt our proposed method to ultrahigh dimensional datasets, we introduce an iterative algorithm based on the multi-block alternating direction method of multipliers (ADMM). Moreover, we demonstrate the global convergence of the algorithm and derive comparable convergence rates. Importantly, our ADMM algorithm can easily be applied to solve various existing fused LASSO models. In terms of theoretical analysis, we establish that the quantile fused LASSO achieves near-oracle properties with a practical penalty parameter and, in addition, possesses a sure screening property under a wide class of error distributions. The numerical experimental results support our claims, showing that the quantile fused LASSO outperforms existing fused regression models in robustness, particularly under heavy-tailed distributions.
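
For orientation, the display below sketches the optimization problem described in the abstract, written in the standard form of the quantile check loss plus a fused LASSO penalty; the notation (design rows x_i, quantile level τ, tuning parameters λ1 and λ2) is generic illustration rather than the paper's exact formulation:

    \min_{\beta \in \mathbb{R}^p} \; \sum_{i=1}^{n} \rho_\tau\bigl(y_i - x_i^{\top}\beta\bigr)
        + \lambda_1 \sum_{j=1}^{p} \lvert\beta_j\rvert
        + \lambda_2 \sum_{j=2}^{p} \lvert\beta_j - \beta_{j-1}\rvert,
    \qquad
    \rho_\tau(u) = u\bigl(\tau - \mathbf{1}\{u < 0\}\bigr).

In a typical multi-block ADMM splitting of such a problem (a generic description, not necessarily the exact scheme of the paper), the residual vector and the two penalty terms are treated as separate blocks, so each subproblem reduces to a linear system, a componentwise soft-thresholding step, or the closed-form proximal operator of the check loss.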

Suggested Citation

  • Wu, Xiaofei & Ming, Hao & Zhang, Zhimin & Cui, Zhenyu, 2024. "Multi-block alternating direction method of multipliers for ultrahigh dimensional quantile fused regression," Computational Statistics & Data Analysis, Elsevier, vol. 192(C).
  • Handle: RePEc:eee:csdana:v:192:y:2024:i:c:s0167947323002128
    DOI: 10.1016/j.csda.2023.107901

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947323002128
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2023.107901?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Xiaofei Wu & Rongmei Liang & Hu Yang, 2022. "Penalized and constrained LAD estimation in fixed and high dimension," Statistical Papers, Springer, vol. 63(1), pages 53-95, February.
    2. Bang, Sungwan & Jhun, Myoungshic, 2012. "Simultaneous estimation and factor selection in quantile regression via adaptive sup-norm regularization," Computational Statistics & Data Analysis, Elsevier, vol. 56(4), pages 813-826.
    3. Alexandre Belloni & Victor Chernozhukov & Kengo Kato, 2019. "Valid Post-Selection Inference in High-Dimensional Approximately Sparse Quantile Regression Models," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 114(526), pages 749-758, April.
    4. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
    5. Yao, Fang & Sue-Chee, Shivon & Wang, Fan, 2017. "Regularized partially functional quantile regression," Journal of Multivariate Analysis, Elsevier, vol. 156(C), pages 39-56.
    6. Ismail Shah & Hina Naz & Sajid Ali & Amani Almohaimeed & Showkat Ahmad Lone, 2023. "A New Quantile-Based Approach for LASSO Estimation," Mathematics, MDPI, vol. 11(6), pages 1-13, March.
    7. Mohamed Ouhourane & Yi Yang & Andréa L. Benedet & Karim Oualkacha, 2022. "Group penalized quantile regression," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 31(3), pages 495-529, September.
    8. Jiang, Rong & Qian, Wei-Min, 2016. "Quantile regression for single-index-coefficient regression models," Statistics & Probability Letters, Elsevier, vol. 110(C), pages 305-317.
    9. Laura Freijeiro‐González & Manuel Febrero‐Bande & Wenceslao González‐Manteiga, 2022. "A Critical Review of LASSO and Its Derivatives for Variable Selection Under Dependence Among Covariates," International Statistical Review, International Statistical Institute, vol. 90(1), pages 118-145, April.
    10. Jiang, He & Luo, Shihua & Dong, Yao, 2021. "Simultaneous feature selection and clustering based on square root optimization," European Journal of Operational Research, Elsevier, vol. 289(1), pages 214-231.
    11. Jeon, Jong-June & Kwon, Sunghoon & Choi, Hosik, 2017. "Homogeneity detection for the high-dimensional generalized linear model," Computational Statistics & Data Analysis, Elsevier, vol. 114(C), pages 61-74.
    12. Chen, Le-Yu & Lee, Sokbae, 2023. "Sparse quantile regression," Journal of Econometrics, Elsevier, vol. 235(2), pages 2195-2217.
    13. Park, Seyoung & Kim, Hyunjin & Lee, Eun Ryung, 2023. "Regional quantile regression for multiple responses," Computational Statistics & Data Analysis, Elsevier, vol. 188(C).
    14. He, Qianchuan & Kong, Linglong & Wang, Yanhua & Wang, Sijian & Chan, Timothy A. & Holland, Eric, 2016. "Regularized quantile regression under heterogeneous sparsity with application to quantitative genetic traits," Computational Statistics & Data Analysis, Elsevier, vol. 95(C), pages 222-239.
    15. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    16. Mkhadri, Abdallah & Ouhourane, Mohamed, 2013. "An extended variable inclusion and shrinkage algorithm for correlated variables," Computational Statistics & Data Analysis, Elsevier, vol. 57(1), pages 631-644.
    17. Chuliá, Helena & Garrón, Ignacio & Uribe, Jorge M., 2024. "Daily growth at risk: Financial or real drivers? The answer is not always the same," International Journal of Forecasting, Elsevier, vol. 40(2), pages 762-776.
    18. Jian Guo & Elizaveta Levina & George Michailidis & Ji Zhu, 2010. "Pairwise Variable Selection for High-Dimensional Model-Based Clustering," Biometrics, The International Biometric Society, vol. 66(3), pages 793-804, September.
    19. Alexandre Belloni & Victor Chernozhukov & Ivan Fernandez-Val & Christian Hansen, 2013. "Program evaluation with high-dimensional data," CeMMAP working papers CWP77/13, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    20. Lu Tang & Ling Zhou & Peter X. K. Song, 2019. "Fusion learning algorithm to combine partially heterogeneous Cox models," Computational Statistics, Springer, vol. 34(1), pages 395-414, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:192:y:2024:i:c:s0167947323002128. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.