Printed from https://ideas.repec.org/a/spr/joptap/v160y2014i3d10.1007_s10957-013-0409-2.html

Incrementally Updated Gradient Methods for Constrained and Regularized Optimization

Author

Listed:
  • Paul Tseng

    (University of Washington)

  • Sangwoon Yun

    (Sungkyunkwan University)

Abstract

We consider incrementally updated gradient methods for minimizing the sum of smooth functions and a convex function. These methods can use a (sufficiently small) constant stepsize or, more practically, an adaptive stepsize that is decreased whenever sufficient progress is not made. We show that if the gradients of the smooth functions are Lipschitz continuous on the space of n-dimensional real column vectors, or if the gradients of the smooth functions are bounded and Lipschitz continuous over a certain level set and the convex function is Lipschitz continuous on its domain, then every cluster point of the iterates generated by the method is a stationary point. If, in addition, a local Lipschitz error bound assumption holds, then the method is linearly convergent.
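The scheme described in the abstract can be sketched roughly as follows. This is a simplified illustration, not the authors' exact algorithm: the function names, the per-component proximal step, and the objective-based "sufficient progress" test used to shrink the stepsize are all assumptions made for the sketch.

```python
import numpy as np

def prox_l1(x, t):
    # Soft-thresholding: the proximal operator of t * ||x||_1,
    # a common choice for the convex (regularization) term h.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def incremental_prox_gradient(fs, grads, prox, x0,
                              stepsize=0.1, shrink=0.5, n_epochs=300):
    """Minimize sum_i f_i(x) + h(x) incrementally (sketch).

    One epoch cycles through the component gradients, taking a
    proximal-gradient step after each. If a full pass fails to
    decrease the smooth sum, the stepsize is shrunk -- a crude
    stand-in for the adaptive stepsize rule in the abstract.
    """
    x = x0.copy()
    obj = sum(f(x) for f in fs)
    for _ in range(n_epochs):
        x_trial = x.copy()
        for g in grads:                        # one incremental pass
            x_trial = prox(x_trial - stepsize * g(x_trial), stepsize)
        obj_trial = sum(f(x_trial) for f in fs)
        if obj_trial < obj - 1e-12:            # sufficient progress made
            x, obj = x_trial, obj_trial
        else:                                  # no progress: decrease stepsize
            stepsize *= shrink
            if stepsize < 1e-12:
                break
    return x

# Toy problem: f_1(x) = (x-2)^2/2, f_2(x) = (x-4)^2/2, no convex term
# (prox is the identity). The smooth sum is minimized at x = 3.
fs = [lambda x: 0.5 * (x[0] - 2.0) ** 2,
      lambda x: 0.5 * (x[0] - 4.0) ** 2]
grads = [lambda x: x - 2.0, lambda x: x - 4.0]
x_star = incremental_prox_gradient(fs, grads, lambda z, t: z,
                                   np.array([0.0]))
```

With a constant stepsize the iterates settle at a point slightly off the true minimizer; the progress test then halves the stepsize, which is what drives the iterates toward a stationary point, mirroring the adaptive-stepsize behavior the paper analyzes.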

Suggested Citation

  • Paul Tseng & Sangwoon Yun, 2014. "Incrementally Updated Gradient Methods for Constrained and Regularized Optimization," Journal of Optimization Theory and Applications, Springer, vol. 160(3), pages 832-853, March.
  • Handle: RePEc:spr:joptap:v:160:y:2014:i:3:d:10.1007_s10957-013-0409-2
    DOI: 10.1007/s10957-013-0409-2

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-013-0409-2
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-013-0409-2?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. P. Tseng & S. Yun, 2009. "Block-Coordinate Gradient Descent Method for Linearly Constrained Nonsmooth Separable Optimization," Journal of Optimization Theory and Applications, Springer, vol. 140(3), pages 513-535, March.
    2. Friedman, Jerome H. & Hastie, Trevor & Tibshirani, Rob, 2010. "Regularization Paths for Generalized Linear Models via Coordinate Descent," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 33(i01).
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Sangho Kum & Sangwoon Yun, 2017. "Incremental Gradient Method for Karcher Mean on Symmetric Cones," Journal of Optimization Theory and Applications, Springer, vol. 172(1), pages 141-155, January.
    2. Paul Armand & Ngoc Nguyen Tran, 2021. "Local Convergence Analysis of a Primal–Dual Method for Bound-Constrained Optimization Without SOSC," Journal of Optimization Theory and Applications, Springer, vol. 189(1), pages 96-116, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pei Wang & Shunjie Chen & Sijia Yang, 2022. "Recent Advances on Penalized Regression Models for Biological Data," Mathematics, MDPI, vol. 10(19), pages 1-24, October.
    2. Jeon, Jong-June & Kim, Yongdai & Won, Sungho & Choi, Hosik, 2020. "Primal path algorithm for compositional data analysis," Computational Statistics & Data Analysis, Elsevier, vol. 148(C).
    3. Eric P Xing & Ross E Curtis & Georg Schoenherr & Seunghak Lee & Junming Yin & Kriti Puniyani & Wei Wu & Peter Kinnaird, 2014. "GWAS in a Box: Statistical and Visual Analytics of Structured Associations via GenAMap," PLOS ONE, Public Library of Science, vol. 9(6), pages 1-19, June.
    4. Abhik Ghosh & Magne Thoresen, 2018. "Non-concave penalization in linear mixed-effect models and regularized selection of fixed effects," AStA Advances in Statistical Analysis, Springer;German Statistical Society, vol. 102(2), pages 179-210, April.
    5. Jan Pablo Burgard & Joscha Krause & Dennis Kreber & Domingo Morales, 2021. "The generalized equivalence of regularization and min–max robustification in linear mixed models," Statistical Papers, Springer, vol. 62(6), pages 2857-2883, December.
    6. R. Lopes & S. A. Santos & P. J. S. Silva, 2019. "Accelerating block coordinate descent methods with identification strategies," Computational Optimization and Applications, Springer, vol. 72(3), pages 609-640, April.
    7. Mingyi Hong & Tsung-Hui Chang & Xiangfeng Wang & Meisam Razaviyayn & Shiqian Ma & Zhi-Quan Luo, 2020. "A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization," Mathematics of Operations Research, INFORMS, vol. 45(3), pages 833-861, August.
    8. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    9. Ernesto Carrella & Richard M. Bailey & Jens Koed Madsen, 2018. "Indirect inference through prediction," Papers 1807.01579, arXiv.org.
    10. Rui Wang & Naihua Xiu & Kim-Chuan Toh, 2021. "Subspace quadratic regularization method for group sparse multinomial logistic regression," Computational Optimization and Applications, Springer, vol. 79(3), pages 531-559, July.
    11. Mkhadri, Abdallah & Ouhourane, Mohamed, 2013. "An extended variable inclusion and shrinkage algorithm for correlated variables," Computational Statistics & Data Analysis, Elsevier, vol. 57(1), pages 631-644.
    12. Masakazu Higuchi & Mitsuteru Nakamura & Shuji Shinohara & Yasuhiro Omiya & Takeshi Takano & Daisuke Mizuguchi & Noriaki Sonota & Hiroyuki Toda & Taku Saito & Mirai So & Eiji Takayama & Hiroo Terashi &, 2022. "Detection of Major Depressive Disorder Based on a Combination of Voice Features: An Exploratory Approach," IJERPH, MDPI, vol. 19(18), pages 1-13, September.
    13. Min Tao & Jiang-Ning Li, 2023. "Error Bound and Isocost Imply Linear Convergence of DCA-Based Algorithms to D-Stationarity," Journal of Optimization Theory and Applications, Springer, vol. 197(1), pages 205-232, April.
    14. Susan Athey & Guido W. Imbens & Stefan Wager, 2018. "Approximate residual balancing: debiased inference of average treatment effects in high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 80(4), pages 597-623, September.
    15. Vincent, Martin & Hansen, Niels Richard, 2014. "Sparse group lasso and high dimensional multinomial classification," Computational Statistics & Data Analysis, Elsevier, vol. 71(C), pages 771-786.
    16. Chen, Le-Yu & Lee, Sokbae, 2018. "Best subset binary prediction," Journal of Econometrics, Elsevier, vol. 206(1), pages 39-56.
    17. Perrot-Dockès Marie & Lévy-Leduc Céline & Chiquet Julien & Sansonnet Laure & Brégère Margaux & Étienne Marie-Pierre & Robin Stéphane & Genta-Jouve Grégory, 2018. "A variable selection approach in the multivariate linear model: an application to LC-MS metabolomics data," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 17(5), pages 1-14, October.
    18. Fan, Jianqing & Jiang, Bai & Sun, Qiang, 2022. "Bayesian factor-adjusted sparse regression," Journal of Econometrics, Elsevier, vol. 230(1), pages 3-19.
    19. Chuliá, Helena & Garrón, Ignacio & Uribe, Jorge M., 2024. "Daily growth at risk: Financial or real drivers? The answer is not always the same," International Journal of Forecasting, Elsevier, vol. 40(2), pages 762-776.
    20. Jun Li & Serguei Netessine & Sergei Koulayev, 2018. "Price to Compete … with Many: How to Identify Price Competition in High-Dimensional Space," Management Science, INFORMS, vol. 64(9), pages 4118-4136, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joptap:v:160:y:2014:i:3:d:10.1007_s10957-013-0409-2. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.