IDEAS — https://ideas.repec.org/a/eee/csdana/v147y2020ics0167947320300347.html

Separating variables to accelerate non-convex regularized optimization

Author

Listed:
  • Liu, Wenchen
  • Tang, Yincai
  • Wu, Xianyi

Abstract

In this paper, a novel variable separation algorithm, stemming from the idea of orthogonalizing EM, is proposed to minimize a general function with a non-convex regularizer. The main idea is to construct a new surrogate function by adding a term that allows the minimization to be solved separately for each component. Several attractive theoretical properties of the new algorithm are established: it converges to a critical point provided the objective function is coercive or the generated sequence lies in a compact set, and its convergence rate is derived. The Barzilai–Borwein (BB) rule and Nesterov’s method are used to accelerate the algorithm, which can also be applied to minimize a general function with a group-structured regularizer. Simulation and real-data results show that these acceleration schemes speed up the method substantially.
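The paper itself is behind a paywall, but the separation idea described in the abstract can be illustrated generically: adding a quadratic term centered at the current iterate majorizes the smooth loss, so the surrogate splits into independent scalar problems, each solved by a closed-form thresholding of the non-convex penalty. The sketch below is NOT the authors' algorithm; it is a minimal, assumed illustration using least squares with an MCP penalty, with all function names (`mcp_prox`, `separated_mcp_ls`) and parameter choices hypothetical.

```python
import numpy as np

def mcp_prox(u, lam, gamma, t):
    """Componentwise minimizer of (t/2)(x - u)^2 + MCP(x; lam, gamma).
    The subproblem is convex in each coordinate when t > 1/gamma."""
    out = u.copy()  # for |u| > gamma*lam the MCP penalty is flat, so x = u
    small = np.abs(u) <= gamma * lam
    out[small] = (np.sign(u[small])
                  * np.maximum(t * np.abs(u[small]) - lam, 0.0)
                  / (t - 1.0 / gamma))
    return out

def separated_mcp_ls(X, y, lam=0.5, gamma=3.0, n_iter=200):
    """Minimize 0.5*||y - X b||^2 + sum_j MCP(b_j) by separating variables:
    at each step, the added quadratic (t/2)*||b - b_k||^2 dominates the
    coupling in X'X, so the update reduces to a per-coordinate prox."""
    t = np.linalg.eigvalsh(X.T @ X).max()  # curvature of the added term
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)
        b = mcp_prox(b - grad / t, lam, gamma, t)  # fully separable step
    return b
```

The key design point matching the abstract is that the added quadratic term decouples the coordinates, so each update is a one-dimensional problem with a closed-form solution; a BB step size or Nesterov extrapolation could then be layered on the same iteration to accelerate it.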

Suggested Citation

  • Liu, Wenchen & Tang, Yincai & Wu, Xianyi, 2020. "Separating variables to accelerate non-convex regularized optimization," Computational Statistics & Data Analysis, Elsevier, vol. 147(C).
  • Handle: RePEc:eee:csdana:v:147:y:2020:i:c:s0167947320300347
    DOI: 10.1016/j.csda.2020.106943

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947320300347
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2020.106943?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Dankmar Böhning & Bruce Lindsay, 1988. "Monotonicity of quadratic-approximation algorithms," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 40(4), pages 641-663, December.
    2. Friedman, Jerome H. & Hastie, Trevor & Tibshirani, Rob, 2010. "Regularization Paths for Generalized Linear Models via Coordinate Descent," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 33(i01).
    3. Mazumder, Rahul & Friedman, Jerome H. & Hastie, Trevor, 2011. "SparseNet: Coordinate Descent With Nonconvex Penalties," Journal of the American Statistical Association, American Statistical Association, vol. 106(495), pages 1125-1138.
    4. Fan, Jianqing & Li, Runze, 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96(456), pages 1348-1360, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Umberto Amato & Anestis Antoniadis & Italia Feis & Irène Gijbels, 2022. "Penalized wavelet estimation and robust denoising for irregular spaced data," Computational Statistics, Springer, vol. 37(4), pages 1621-1651, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jin, Shaobo & Moustaki, Irini & Yang-Wallentin, Fan, 2018. "Approximated penalized maximum likelihood for exploratory factor analysis: an orthogonal case," LSE Research Online Documents on Economics 88118, London School of Economics and Political Science, LSE Library.
    2. Shaobo Jin & Irini Moustaki & Fan Yang-Wallentin, 2018. "Approximated Penalized Maximum Likelihood for Exploratory Factor Analysis: An Orthogonal Case," Psychometrika, Springer;The Psychometric Society, vol. 83(3), pages 628-649, September.
    3. Runmin Shi & Faming Liang & Qifan Song & Ye Luo & Malay Ghosh, 2018. "A Blockwise Consistency Method for Parameter Estimation of Complex Models," Sankhya B: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 80(1), pages 179-223, December.
    4. Siwei Xia & Yuehan Yang & Hu Yang, 2022. "Sparse Laplacian Shrinkage with the Graphical Lasso Estimator for Regression Problems," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 31(1), pages 255-277, March.
    5. Hirose, Kei & Tateishi, Shohei & Konishi, Sadanori, 2013. "Tuning parameter selection in sparse regression modeling," Computational Statistics & Data Analysis, Elsevier, vol. 59(C), pages 28-40.
    6. Benjamin G. Stokell & Rajen D. Shah & Ryan J. Tibshirani, 2021. "Modelling high‐dimensional categorical data using nonconvex fusion penalties," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(3), pages 579-611, July.
    7. Lee, Sangin & Kwon, Sunghoon & Kim, Yongdai, 2016. "A modified local quadratic approximation algorithm for penalized optimization problems," Computational Statistics & Data Analysis, Elsevier, vol. 94(C), pages 275-286.
    8. Piotr Pokarowski & Wojciech Rejchel & Agnieszka Sołtys & Michał Frej & Jan Mielniczuk, 2022. "Improving Lasso for model selection and prediction," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 49(2), pages 831-863, June.
    9. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    10. Margherita Giuzio, 2017. "Genetic algorithm versus classical methods in sparse index tracking," Decisions in Economics and Finance, Springer;Associazione per la Matematica, vol. 40(1), pages 243-256, November.
    11. Fan, Jianqing & Jiang, Bai & Sun, Qiang, 2022. "Bayesian factor-adjusted sparse regression," Journal of Econometrics, Elsevier, vol. 230(1), pages 3-19.
    12. Hui Xiao & Yiguo Sun, 2020. "Forecasting the Returns of Cryptocurrency: A Model Averaging Approach," JRFM, MDPI, vol. 13(11), pages 1-15, November.
    13. Zhu Wang, 2022. "MM for penalized estimation," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 31(1), pages 54-75, March.
    14. Naimoli, Antonio, 2022. "Modelling the persistence of Covid-19 positivity rate in Italy," Socio-Economic Planning Sciences, Elsevier, vol. 82(PA).
    15. Camila Epprecht & Dominique Guegan & Álvaro Veiga & Joel Correa da Rosa, 2017. "Variable selection and forecasting via automated methods for linear models: LASSO/adaLASSO and Autometrics," Post-Print halshs-00917797, HAL.
    16. Zichen Zhang & Ye Eun Bae & Jonathan R. Bradley & Lang Wu & Chong Wu, 2022. "SUMMIT: An integrative approach for better transcriptomic data imputation improves causal gene identification," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    17. Bartosz Uniejewski, 2024. "Regularization for electricity price forecasting," Papers 2404.03968, arXiv.org.
    18. Peter Bühlmann & Jacopo Mandozzi, 2014. "High-dimensional variable screening and bias in subsequent inference, with an empirical comparison," Computational Statistics, Springer, vol. 29(3), pages 407-430, June.
    19. Peter Martey Addo & Dominique Guegan & Bertrand Hassani, 2018. "Credit Risk Analysis Using Machine and Deep Learning Models," Risks, MDPI, vol. 6(2), pages 1-20, April.
    20. Capanu, Marinela & Giurcanu, Mihai & Begg, Colin B. & Gönen, Mithat, 2023. "Subsampling based variable selection for generalized linear models," Computational Statistics & Data Analysis, Elsevier, vol. 184(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:147:y:2020:i:c:s0167947320300347. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.