
Subspace Newton method for sparse group $$\ell_0$$ optimization problem

Authors

Listed:
  • Shichen Liao

    (University of Chinese Academy of Sciences)

  • Congying Han

    (University of Chinese Academy of Sciences)

  • Tiande Guo

    (University of Chinese Academy of Sciences)

  • Bonan Li

    (University of Chinese Academy of Sciences)

Abstract

This paper investigates sparse optimization problems characterized by a sparse group structure, where element-wise and group-wise sparsity are considered jointly. This optimization model has proved effective in tasks such as feature selection, parameter estimation, and improving model interpretability. Central to our study is the $$\ell_0$$ and $$\ell_{2,0}$$ norm regularized model, which, in comparison to alternative surrogate formulations, poses formidable computational challenges. We begin by analyzing the optimality conditions of the sparse group optimization problem, using the notion of a $$\gamma$$-stationary point, whose relationship to local and global minimizers is established. We then develop a novel subspace Newton algorithm for the sparse group $$\ell_0$$ optimization problem and prove its global convergence as well as its local second-order convergence rate. Experimental results show that our algorithm outperforms several state-of-the-art solvers in terms of both precision and computational speed.
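To make the algorithmic idea concrete: for a quadratic data-fidelity term, one iteration of a subspace-Newton-type scheme alternates a thresholding step that guesses the joint element/group support with a Newton step restricted to that support. The Python sketch below is an illustration only, not the authors' algorithm; the support-selection rule, the parameter names (lam1, lam2, and the step size gamma), and the regularized subspace solve are assumptions made for this example.

    import numpy as np

    def subspace_newton_sketch(A, b, lam1, lam2, groups, gamma=1.0, max_iter=100, tol=1e-8):
        """Illustrative subspace-Newton-type iteration for
            min_x 0.5*||Ax - b||^2 + lam1*||x||_0 + lam2*||x||_{2,0}.
        The thresholding rule and parameters here are assumptions for
        illustration; consult the paper for the authors' actual scheme.
        """
        n = A.shape[1]
        x = np.zeros(n)
        for _ in range(max_iter):
            grad = A.T @ (A @ x - b)                     # gradient of the smooth term
            u = x - gamma * grad                         # forward (gradient) step
            support = np.zeros(n, dtype=bool)            # guess the joint support
            for g in groups:                             # groups: list of index arrays
                g = np.asarray(g)
                ug = u[g]
                if 0.5 * ug @ ug > gamma * lam2:         # group survives the l_{2,0} threshold
                    keep = ug ** 2 > 2.0 * gamma * lam1  # elements surviving the l_0 threshold
                    support[g[keep]] = True
            x_new = np.zeros(n)
            if support.any():
                As = A[:, support]
                # Newton step restricted to the identified subspace; for least squares this
                # is the exact restricted minimizer (small ridge added for stability).
                x_new[support] = np.linalg.solve(As.T @ As + 1e-12 * np.eye(As.shape[1]), As.T @ b)
            if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
                return x_new
            x = x_new
        return x

A typical call would pass groups as a partition of the coordinate indices, e.g. groups = [range(0, 5), range(5, 10)] for two groups of five variables, together with data A, b and regularization weights chosen by the user.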

Suggested Citation

  • Shichen Liao & Congying Han & Tiande Guo & Bonan Li, 2024. "Subspace Newton method for sparse group $$\ell_0$$ optimization problem," Journal of Global Optimization, Springer, vol. 90(1), pages 93-125, September.
  • Handle: RePEc:spr:jglopt:v:90:y:2024:i:1:d:10.1007_s10898-024-01396-y
    DOI: 10.1007/s10898-024-01396-y

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10898-024-01396-y
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10898-024-01396-y?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Wanyou Cheng & Zixin Chen & Qingjie Hu, 2020. "An active set Barzilai–Borwein algorithm for $$l_{0}$$ regularized optimization," Journal of Global Optimization, Springer, vol. 76(4), pages 769-791, April.
    2. Xiaotong Shen & Wei Pan & Yunzhang Zhu & Hui Zhou, 2013. "On constrained and regularized high-dimensional regression," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 65(5), pages 807-832, October.
    3. Yanming Li & Bin Nan & Ji Zhu, 2015. "Multivariate sparse group lasso for the multivariate multiple linear regression with an arbitrary group structure," Biometrics, The International Biometric Society, vol. 71(2), pages 354-363, June.
    4. Jingnan Chen & Gengling Dai & Ning Zhang, 2020. "An application of sparse-group lasso regularization to equity portfolio optimization and sector selection," Annals of Operations Research, Springer, vol. 284(1), pages 243-262, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Chen, Yang & Luo, Ziyan & Kong, Lingchen, 2021. "ℓ2,0-norm based selection and estimation for multivariate generalized linear models," Journal of Multivariate Analysis, Elsevier, vol. 185(C).
    2. Sauvenier, Mathieu & Van Bellegem, Sébastien, 2023. "Direction Identification and Minimax Estimation by Generalized Eigenvalue Problem in High Dimensional Sparse Regression," LIDAM Discussion Papers CORE 2023005, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    3. Wang, Yihe & Zhao, Sihai Dave, 2021. "A nonparametric empirical Bayes approach to large-scale multivariate regression," Computational Statistics & Data Analysis, Elsevier, vol. 156(C).
    4. Seokhyun Chung & Raed Al Kontar & Zhenke Wu, 2022. "Weakly Supervised Multi-output Regression via Correlated Gaussian Processes," INFORMS Journal on Data Science, INFORMS, vol. 1(2), pages 115-137, October.
    5. Yuezhang Che & Shuyan Chen & Xin Liu, 2022. "Sparse Index Tracking Portfolio with Sector Neutrality," Mathematics, MDPI, vol. 10(15), pages 1-22, July.
    6. Wentao Wang & Jiaxuan Liang & Rong Liu & Yunquan Song & Min Zhang, 2022. "A Robust Variable Selection Method for Sparse Online Regression via the Elastic Net Penalty," Mathematics, MDPI, vol. 10(16), pages 1-18, August.
    7. Ni, Xuanming & Zheng, Tiantian & Zhao, Huimin & Zhu, Shushang, 2023. "High-dimensional portfolio optimization based on tree-structured factor model," Pacific-Basin Finance Journal, Elsevier, vol. 81(C).
    8. Wenxing Zhu & Huating Huang & Lanfan Jiang & Jianli Chen, 0. "Weighted thresholding homotopy method for sparsity constrained optimization," Journal of Combinatorial Optimization, Springer, vol. 0, pages 1-29.
    9. Ben-Ameur, Walid & Neto, José, 2022. "New bounds for subset selection from conic relaxations," European Journal of Operational Research, Elsevier, vol. 298(2), pages 425-438.
    10. Leonardo Di Gangi & M. Lapucci & F. Schoen & A. Sortino, 2019. "An efficient optimization approach for best subset selection in linear regression, with application to model selection and fitting in autoregressive time-series," Computational Optimization and Applications, Springer, vol. 74(3), pages 919-948, December.
    11. Srinivasan, Arun & Xue, Lingzhou & Zhan, Xiang, 2023. "Identification of microbial features in multivariate regression under false discovery rate control," Computational Statistics & Data Analysis, Elsevier, vol. 181(C).
    12. Dai, Linlin & Chen, Kani & Sun, Zhihua & Liu, Zhenqiu & Li, Gang, 2018. "Broken adaptive ridge regression and its asymptotic properties," Journal of Multivariate Analysis, Elsevier, vol. 168(C), pages 334-351.
    13. Yu-Zhu Tian & Man-Lai Tang & Mao-Zai Tian, 2021. "Bayesian joint inference for multivariate quantile regression model with $$L_{1/2}$$ penalty," Computational Statistics, Springer, vol. 36(4), pages 2967-2994, December.
    14. Chen, Ying & Koch, Thorsten & Zakiyeva, Nazgul & Zhu, Bangzhu, 2020. "Modeling and forecasting the dynamics of the natural gas transmission network in Germany with the demand and supply balance constraint," Applied Energy, Elsevier, vol. 278(C).
    15. Yang, Yuehan & Xia, Siwei & Yang, Hu, 2023. "Multivariate sparse Laplacian shrinkage for joint estimation of two graphical structures," Computational Statistics & Data Analysis, Elsevier, vol. 178(C).
    16. Bai, Ray & Ghosh, Malay, 2018. "High-dimensional multivariate posterior consistency under global–local shrinkage priors," Journal of Multivariate Analysis, Elsevier, vol. 167(C), pages 157-170.
    17. He, Xin & Mao, Xiaojun & Wang, Zhonglei, 2024. "Nonparametric augmented probability weighting with sparsity," Computational Statistics & Data Analysis, Elsevier, vol. 191(C).
    18. Park, Seyoung & Kim, Hyunjin & Lee, Eun Ryung, 2023. "Regional quantile regression for multiple responses," Computational Statistics & Data Analysis, Elsevier, vol. 188(C).
    19. Zhi Zhao & Manuela Zucknick, 2020. "Structured penalized regression for drug sensitivity prediction," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 69(3), pages 525-545, June.
    20. Enrico Civitelli & Matteo Lapucci & Fabio Schoen & Alessio Sortino, 2021. "An effective procedure for feature subset selection in logistic regression based on information criteria," Computational Optimization and Applications, Springer, vol. 80(1), pages 1-32, September.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.