IDEAS — Printed from https://ideas.repec.org/r/cor/louvrp/2701.html

Universal gradient methods for convex optimization problems
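The cited item is Nesterov's universal gradient method, whose key feature is that it adapts to the (unknown) Hölder smoothness of the objective by backtracking on a trial constant L instead of requiring a Lipschitz constant up front. A minimal Euclidean sketch of the primal variant follows; the function names, the quadratic test problem, and the parameter defaults are illustrative assumptions, not taken from the paper.

```python
# Minimal Euclidean sketch of the universal primal gradient step:
# double a trial constant L until a quadratic upper model, relaxed
# by eps/2, majorizes f at the candidate point, then halve L for the
# next iteration. No smoothness constant is assumed known.
import numpy as np

def universal_primal_gradient(f, grad, x0, eps=1e-6, L0=1e-3, iters=100):
    x, L = np.asarray(x0, dtype=float), L0
    for _ in range(iters):
        g = grad(x)
        while True:
            x_new = x - g / L
            d = x_new - x
            # Accept L once the eps-relaxed quadratic model holds.
            if f(x_new) <= f(x) + g @ d + 0.5 * L * (d @ d) + 0.5 * eps:
                break
            L *= 2.0  # backtracking: double the trial constant
        x, L = x_new, L / 2.0  # optimistic halving for the next iteration
    return x

# Hypothetical usage on a smooth convex quadratic: f(x) = 0.5 * ||x||^2.
f = lambda x: 0.5 * float(x @ x)
grad = lambda x: x
x_star = universal_primal_gradient(f, grad, [3.0, -4.0])
```

The halving of L after each accepted step is what makes the method "universal": the trial constant tracks the local, level-dependent smoothness rather than a global worst case.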

Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

Cited by:

  1. Elena Tovbis & Vladimir Krutikov & Lev Kazakovtsev, 2024. "Newtonian Property of Subgradient Method with Optimization of Metric Matrix Parameter Correction," Mathematics, MDPI, vol. 12(11), pages 1-27, May.
  2. Pham Duy Khanh & Boris S. Mordukhovich & Dat Ba Tran, 2024. "Inexact Reduced Gradient Methods in Nonconvex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 203(3), pages 2138-2178, December.
  3. Fedor Stonyakin & Ilya Kuruzov & Boris Polyak, 2023. "Stopping Rules for Gradient Methods for Non-convex Problems with Additive Noise in Gradient," Journal of Optimization Theory and Applications, Springer, vol. 198(2), pages 531-551, August.
  4. Chin Pang Ho & Panos Parpas, 2019. "Empirical risk minimization: probabilistic complexity and stepsize strategy," Computational Optimization and Applications, Springer, vol. 73(2), pages 387-410, June.
  5. Meruza Kubentayeva & Demyan Yarmoshik & Mikhail Persiianov & Alexey Kroshnin & Ekaterina Kotliarova & Nazarii Tupitsa & Dmitry Pasechnyuk & Alexander Gasnikov & Vladimir Shvetsov & Leonid Baryshev & A, 2024. "Primal-dual gradient methods for searching network equilibria in combined models with nested choice structure and capacity constraints," Computational Management Science, Springer, vol. 21(1), pages 1-33, June.
  6. Vladimir Krutikov & Svetlana Gutova & Elena Tovbis & Lev Kazakovtsev & Eugene Semenkin, 2022. "Relaxation Subgradient Algorithms with Machine Learning Procedures," Mathematics, MDPI, vol. 10(21), pages 1-33, October.
  7. Anton Rodomanov & Yurii Nesterov, 2020. "Smoothness Parameter of Power of Euclidean Norm," Journal of Optimization Theory and Applications, Springer, vol. 185(2), pages 303-326, May.
  8. Huynh Ngai & Ta Anh Son, 2022. "Generalized Nesterov’s accelerated proximal gradient algorithms with convergence rate of order o(1/k²)," Computational Optimization and Applications, Springer, vol. 83(2), pages 615-649, November.
  9. Masoud Ahookhosh & Arnold Neumaier, 2018. "Solving structured nonsmooth convex optimization with complexity O(ε^{-1/2})," TOP: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 26(1), pages 110-145, April.
  10. Masaru Ito, 2016. "New results on subgradient methods for strongly convex optimization problems with a unified analysis," Computational Optimization and Applications, Springer, vol. 65(1), pages 127-172, September.
  11. Dvurechensky, Pavel & Gorbunov, Eduard & Gasnikov, Alexander, 2021. "An accelerated directional derivative method for smooth stochastic convex optimization," European Journal of Operational Research, Elsevier, vol. 290(2), pages 601-621.
  12. Meruza Kubentayeva & Alexander Gasnikov, 2021. "Finding Equilibria in the Traffic Assignment Problem with Primal-Dual Gradient Methods for Stable Dynamics Model and Beckmann Model," Mathematics, MDPI, vol. 9(11), pages 1-17, May.
  13. Bolte, Jérôme & Glaudin, Lilian & Pauwels, Edouard & Serrurier, Matthieu, 2021. "A Hölderian backtracking method for min-max and min-min problems," TSE Working Papers 21-1243, Toulouse School of Economics (TSE).
  14. Fedor Stonyakin & Alexander Gasnikov & Pavel Dvurechensky & Alexander Titov & Mohammad Alkousa, 2022. "Generalized Mirror Prox Algorithm for Monotone Variational Inequalities: Universality and Inexact Oracle," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 988-1013, September.
  15. Elena Tovbis & Vladimir Krutikov & Predrag Stanimirović & Vladimir Meshechkin & Aleksey Popov & Lev Kazakovtsev, 2023. "A Family of Multi-Step Subgradient Minimization Methods," Mathematics, MDPI, vol. 11(10), pages 1-24, May.
  16. Alkousa, Mohammad & Stonyakin, Fedor & Gasnikov, Alexander & Abdo, Asmaa & Alcheikh, Mohammad, 2024. "Higher degree inexact model for optimization problems," Chaos, Solitons & Fractals, Elsevier, vol. 186(C).
  17. Filip Hanzely & Peter Richtárik & Lin Xiao, 2021. "Accelerated Bregman proximal gradient methods for relatively smooth convex optimization," Computational Optimization and Applications, Springer, vol. 79(2), pages 405-440, June.
  18. Benjamin Grimmer, 2023. "General Hölder Smooth Convergence Rates Follow from Specialized Rates Assuming Growth Bounds," Journal of Optimization Theory and Applications, Springer, vol. 197(1), pages 51-70, April.
  19. Hu, Yaohua & Li, Gongnong & Yu, Carisa Kwok Wai & Yip, Tsz Leung, 2022. "Quasi-convex feasibility problems: Subgradient methods and convergence rates," European Journal of Operational Research, Elsevier, vol. 298(1), pages 45-58.
  20. Masoud Ahookhosh, 2019. "Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 89(3), pages 319-353, June.
  21. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
  22. Eduard Gorbunov & Marina Danilova & Innokentiy Shibaev & Pavel Dvurechensky & Alexander Gasnikov, 2024. "High-Probability Complexity Bounds for Non-smooth Stochastic Convex Optimization with Heavy-Tailed Noise," Journal of Optimization Theory and Applications, Springer, vol. 203(3), pages 2679-2738, December.
  23. Vladimir Krutikov & Elena Tovbis & Svetlana Gutova & Ivan Rozhnov & Lev Kazakovtsev, 2024. "Gradient Method with Step Adaptation," Mathematics, MDPI, vol. 13(1), pages 1-35, December.
  24. Stefania Bellavia & Gianmarco Gurioli & Benedetta Morini & Philippe Louis Toint, 2023. "The Impact of Noise on Evaluation Complexity: The Deterministic Trust-Region Case," Journal of Optimization Theory and Applications, Springer, vol. 196(2), pages 700-729, February.
  25. Masaru Ito & Mituhiro Fukuda, 2021. "Nearly Optimal First-Order Methods for Convex Optimization under Gradient Norm Measure: an Adaptive Regularization Approach," Journal of Optimization Theory and Applications, Springer, vol. 188(3), pages 770-804, March.
  26. Guillaume O. Berger & P.-A. Absil & Raphaël M. Jungers & Yurii Nesterov, 2020. "On the Quality of First-Order Approximation of Functions with Hölder Continuous Gradient," Journal of Optimization Theory and Applications, Springer, vol. 185(1), pages 17-33, April.
IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.