Coordinate descent methods beyond smoothness and separability
DOI: 10.1007/s10589-024-00556-w
References listed on IDEAS
- Haihao Lu & Robert M. Freund & Yurii Nesterov, 2018. "Relatively smooth convex optimization by first-order methods, and applications," LIDAM Reprints CORE 2965, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
- Yurii Nesterov & Sebastian U. Stich, 2017. "Efficiency of the accelerated coordinate descent method on structured optimization problems," LIDAM Reprints CORE 2845, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
- Lorenzo Stella & Andreas Themelis & Panagiotis Patrinos, 2017. "Forward–backward quasi-Newton methods for nonsmooth optimization problems," Computational Optimization and Applications, Springer, vol. 67(3), pages 443-487, July.
- Ion Necoara & Yurii Nesterov & François Glineur, 2019. "Linear convergence of first order methods for non-strongly convex optimization," LIDAM Reprints CORE 3000, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
- Filip Hanzely & Peter Richtárik, 2021. "Fastest rates for stochastic mirror descent methods," Computational Optimization and Applications, Springer, vol. 79(3), pages 717-766, July.
- Olivier Fercoq & Zheng Qu, 2020. "Restarting the accelerated coordinate descent method with a rough strong convexity estimate," Computational Optimization and Applications, Springer, vol. 75(1), pages 63-91, January.
- Yurii Nesterov, 2012. "Efficiency of coordinate descent methods on huge-scale optimization problems," LIDAM Reprints CORE 2511, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
- Aviad Aberdam & Amir Beck, 2022. "An Accelerated Coordinate Gradient Descent Algorithm for Non-separable Composite Optimization," Journal of Optimization Theory and Applications, Springer, vol. 193(1), pages 219-246, June.
- Pontus Giselsson & Mattias Fält, 2018. "Envelope Functions: Unifications and Further Properties," Journal of Optimization Theory and Applications, Springer, vol. 178(3), pages 673-698, September.
- Yurii Nesterov & Vladimir Spokoiny, 2017. "Random gradient-free minimization of convex functions," LIDAM Reprints CORE 2851, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
Most related items
These are the items that most often cite the same works as this one and are cited by the same works as this one.
- Susan Ghaderi & Masoud Ahookhosh & Adam Arany & Alexander Skupin & Panagiotis Patrinos & Yves Moreau, 2024. "Smoothing unadjusted Langevin algorithms for nonsmooth composite potential functions," Applied Mathematics and Computation, Elsevier, vol. 464(C).
- Aviad Aberdam & Amir Beck, 2022. "An Accelerated Coordinate Gradient Descent Algorithm for Non-separable Composite Optimization," Journal of Optimization Theory and Applications, Springer, vol. 193(1), pages 219-246, June.
- Pavel Dvurechensky & Eduard Gorbunov & Alexander Gasnikov, 2021. "An accelerated directional derivative method for smooth stochastic convex optimization," European Journal of Operational Research, Elsevier, vol. 290(2), pages 601-621.
- Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
- Yin Liu & Sam Davanloo Tajbakhsh, 2023. "Stochastic Composition Optimization of Functions Without Lipschitz Continuous Gradient," Journal of Optimization Theory and Applications, Springer, vol. 198(1), pages 239-289, July.
- Leandro Farias Maia & David Huckleberry Gutman & Ryan Christopher Hughes, 2024. "The Inexact Cyclic Block Proximal Gradient Method and Properties of Inexact Proximal Maps," Journal of Optimization Theory and Applications, Springer, vol. 201(2), pages 668-698, May.
- Pourya Behmandpoor & Puya Latafat & Andreas Themelis & Marc Moonen & Panagiotis Patrinos, 2024. "SPIRAL: a superlinearly convergent incremental proximal algorithm for nonconvex finite sum minimization," Computational Optimization and Applications, Springer, vol. 88(1), pages 71-106, May.
- Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization," Computational Optimization and Applications, Springer, vol. 79(3), pages 681-715, July.
- Hadi Abbaszadehpeivasti, 2024. "Performance analysis of optimization methods for machine learning," Other publications TiSEM 3050a62d-1a1f-494e-99ef-7, Tilburg University, School of Economics and Management.
- Yanli Liu & Wotao Yin, 2019. "An Envelope for Davis–Yin Splitting and Strict Saddle-Point Avoidance," Journal of Optimization Theory and Applications, Springer, vol. 181(2), pages 567-587, May.
- Moslem Zamani & Hadi Abbaszadehpeivasti & Etienne de Klerk, 2024. "The exact worst-case convergence rate of the alternating direction method of multipliers," Other publications TiSEM f30ae9e6-ed19-423f-bd1e-0, Tilburg University, School of Economics and Management.
- David Kozak & Stephen Becker & Alireza Doostan & Luis Tenorio, 2021. "A stochastic subspace approach to gradient-free optimization in high dimensions," Computational Optimization and Applications, Springer, vol. 79(2), pages 339-368, June.
- Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2018. "Exact Worst-Case Convergence Rates of the Proximal Gradient Method for Composite Convex Minimization," Journal of Optimization Theory and Applications, Springer, vol. 178(2), pages 455-476, August.
- Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2018. "Exact worst-case convergence rates of the proximal gradient method for composite convex minimization," LIDAM Reprints CORE 2975, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
- Olivier Fercoq & Zheng Qu, 2020. "Restarting the accelerated coordinate descent method with a rough strong convexity estimate," Computational Optimization and Applications, Springer, vol. 75(1), pages 63-91, January.
- Hui Zhang & Yu-Hong Dai & Lei Guo & Wei Peng, 2021. "Proximal-Like Incremental Aggregated Gradient Method with Linear Convergence Under Bregman Distance Growth Conditions," Mathematics of Operations Research, INFORMS, vol. 46(1), pages 61-81, February.
- Anastasiya Ivanova & Pavel Dvurechensky & Evgeniya Vorontsova & Dmitry Pasechnyuk & Alexander Gasnikov & Darina Dvinskikh & Alexander Tyurin, 2022. "Oracle Complexity Separation in Convex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 193(1), pages 462-490, June.
- V. Kungurtsev & F. Rinaldi, 2021. "A zeroth order method for stochastic weakly convex optimization," Computational Optimization and Applications, Springer, vol. 80(3), pages 731-753, December.
- Hoang Tran & Qiang Du & Guannan Zhang, 2025. "Convergence analysis for a nonlocal gradient descent method via directional Gaussian smoothing," Computational Optimization and Applications, Springer, vol. 90(2), pages 481-513, March.
- Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2016. "Exact worst-case performance of first-order methods for composite convex optimization," LIDAM Discussion Papers CORE 2016052, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
- Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2017. "Exact worst-case performance of first-order methods for composite convex optimization," LIDAM Reprints CORE 2875, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
- Andrej Čopar & Blaž Zupan & Marinka Zitnik, 2019. "Fast optimization of non-negative matrix tri-factorization," PLOS ONE, Public Library of Science, vol. 14(6), pages 1-15, June.
More about this item
Keywords
Convex optimization; Growth condition; Nonsmooth and nonseparable objective; Coordinate descent; Convergence analysis.
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:coopap:v:88:y:2024:i:1:d:10.1007_s10589-024-00556-w. See general information about how to correct material in RePEc.