
A flexible coordinate descent method

Authors

Listed:
  • Kimon Fountoulakis (University of California Berkeley)
  • Rachael Tappenden (University of Canterbury)

Abstract

We present a novel randomized block coordinate descent method for the minimization of a convex composite objective function. The method uses (approximate) partial second-order (curvature) information, so that the algorithm's performance is more robust when applied to highly nonseparable or ill-conditioned problems. We call the method Flexible Coordinate Descent (FCD). At each iteration of FCD, a block of coordinates is sampled randomly, a quadratic model is formed about that block, and the model is minimized approximately (inexactly) to determine the search direction. An inexpensive line search is then employed to ensure a monotonic decrease in the objective function and acceptance of large step sizes. We present several high-probability iteration complexity results showing that convergence of FCD is guaranteed theoretically. Finally, we present numerical results on large-scale problems to demonstrate the practical performance of the method.
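
For intuition, below is a minimal sketch of the iteration the abstract describes: sample a block of coordinates, build a quadratic model from block curvature information, minimize the model inexactly, then line search. Every concrete choice in it is an illustrative assumption rather than the paper's method: the objective is specialized to smooth least squares (no nonsmooth composite term), the curvature information is the exact block Hessian, the "inexact" solve is a fixed number of gradient steps on the model, and the name fcd_sketch and all parameter values are invented for this example.

import numpy as np

rng = np.random.default_rng(0)

def fcd_sketch(A, b, block_size=10, inner_steps=5, max_iters=500, tol=1e-8):
    """One possible reading of the FCD-style loop from the abstract,
    specialized (as an assumption) to f(x) = 0.5 * ||Ax - b||^2."""
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(max_iters):
        r = A @ x - b
        g = A.T @ r                      # full gradient; a real coordinate
                                         # method would form only the block part
        if np.linalg.norm(g) < tol:      # simple stationarity check
            break
        fx = 0.5 * (r @ r)

        # 1) Sample a block of coordinates uniformly at random.
        S = rng.choice(n, size=block_size, replace=False)
        gS = g[S]                        # block gradient
        HS = A[:, S].T @ A[:, S]         # partial second-order (curvature) info

        # 2) Inexactly minimize the block model m(d) = gS.T d + 0.5 d.T HS d
        #    by a fixed number of gradient steps (a stand-in for whatever
        #    approximate subproblem solver one prefers).
        d = np.zeros(block_size)
        L = np.linalg.norm(HS, 2) + 1e-12    # step size from the model's curvature
        for _ in range(inner_steps):
            d -= (gS + HS @ d) / L

        # 3) Backtracking line search: try the full step first, then halve
        #    until a sufficient (monotonic) decrease is achieved.
        alpha = 1.0
        while alpha > 1e-10:
            x_trial = x.copy()
            x_trial[S] += alpha * d
            f_trial = 0.5 * np.linalg.norm(A @ x_trial - b) ** 2
            if f_trial <= fx + 1e-4 * alpha * (gS @ d):   # Armijo-type test
                break
            alpha *= 0.5
        x = x_trial
    return x

# Tiny usage example on a random least-squares instance.
A = rng.standard_normal((200, 80))
b = rng.standard_normal(200)
x_hat = fcd_sketch(A, b)
print("final objective:", 0.5 * np.linalg.norm(A @ x_hat - b) ** 2)

The backtracking loop tries the full step alpha = 1 first, matching the abstract's emphasis on accepting large step sizes. Since each trial point changes only the sampled coordinates, a practical implementation would update the residual A @ x - b incrementally rather than recomputing it from scratch.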

Suggested Citation

  • Kimon Fountoulakis & Rachael Tappenden, 2018. "A flexible coordinate descent method," Computational Optimization and Applications, Springer, vol. 70(2), pages 351-394, June.
  • Handle: RePEc:spr:coopap:v:70:y:2018:i:2:d:10.1007_s10589-018-9984-3
    DOI: 10.1007/s10589-018-9984-3

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10589-018-9984-3
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10589-018-9984-3?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Paul Tseng & Sangwoon Yun, 2010. "A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training," Computational Optimization and Applications, Springer, vol. 47(2), pages 179-206, October.
    2. Yurii Nesterov, 2012. "Efficiency of coordinate descent methods on huge-scale optimization problems," LIDAM Reprints CORE 2511, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    3. Olivier Devolder & François Glineur & Yurii Nesterov, 2011. "First-order methods of smooth convex optimization with inexact oracle," LIDAM Discussion Papers CORE 2011002, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    4. P. Tseng & S. Yun, 2009. "Block-Coordinate Gradient Descent Method for Linearly Constrained Nonsmooth Separable Optimization," Journal of Optimization Theory and Applications, Springer, vol. 140(3), pages 513-535, March.
    5. Ion Necoara & Andrei Patrascu, 2014. "A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints," Computational Optimization and Applications, Springer, vol. 57(2), pages 307-337, March.
    6. Cassioli, A. & Di Lorenzo, D. & Sciandrone, M., 2013. "On the convergence of inexact block coordinate descent methods for constrained optimization," European Journal of Operational Research, Elsevier, vol. 231(2), pages 274-281.
    7. Kimon Fountoulakis & Jacek Gondzio, 2016. "Performance of first- and second-order methods for $$\ell_1$$-regularized least squares problems," Computational Optimization and Applications, Springer, vol. 65(3), pages 605-635, December.
    8. Rachael Tappenden & Peter Richtárik & Jacek Gondzio, 2016. "Inexact Coordinate Descent: Complexity and Preconditioning," Journal of Optimization Theory and Applications, Springer, vol. 170(1), pages 144-176, July.
    9. Olivier Devolder & François Glineur & Yurii Nesterov, 2013. "Intermediate gradient methods for smooth convex problems with inexact oracle," LIDAM Discussion Papers CORE 2013017, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Christian Kanzow & Theresa Lechner, 2021. "Globalized inexact proximal Newton-type methods for nonconvex composite functions," Computational Optimization and Applications, Springer, vol. 78(2), pages 377-410, March.
    2. R. Lopes & S. A. Santos & P. J. S. Silva, 2019. "Accelerating block coordinate descent methods with identification strategies," Computational Optimization and Applications, Springer, vol. 72(3), pages 609-640, April.
    3. Bastian Pötzl & Anton Schiela & Patrick Jaap, 2022. "Second order semi-smooth Proximal Newton methods in Hilbert spaces," Computational Optimization and Applications, Springer, vol. 82(2), pages 465-498, June.
    4. Majid Jahani & Naga Venkata C. Gudapati & Chenxin Ma & Rachael Tappenden & Martin Takáč, 2021. "Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences," Computational Optimization and Applications, Springer, vol. 79(2), pages 369-404, June.
    5. Ching-pei Lee & Stephen J. Wright, 2020. "Inexact Variable Metric Stochastic Block-Coordinate Descent for Regularized Optimization," Journal of Optimization Theory and Applications, Springer, vol. 185(1), pages 151-187, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Rachael Tappenden & Peter Richtárik & Jacek Gondzio, 2016. "Inexact Coordinate Descent: Complexity and Preconditioning," Journal of Optimization Theory and Applications, Springer, vol. 170(1), pages 144-176, July.
    2. Ion Necoara & Andrei Patrascu, 2014. "A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints," Computational Optimization and Applications, Springer, vol. 57(2), pages 307-337, March.
    3. Sjur Didrik Flåm, 2019. "Blocks of coordinates, stochastic programming, and markets," Computational Management Science, Springer, vol. 16(1), pages 3-16, February.
    4. Ion Necoara & Yurii Nesterov & François Glineur, 2017. "Random Block Coordinate Descent Methods for Linearly Constrained Optimization over Networks," Journal of Optimization Theory and Applications, Springer, vol. 173(1), pages 227-254, April.
    5. Amir Beck, 2014. "The 2-Coordinate Descent Method for Solving Double-Sided Simplex Constrained Minimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 162(3), pages 892-919, September.
    6. Andrea Cristofari, 2019. "An almost cyclic 2-coordinate descent method for singly linearly constrained problems," Computational Optimization and Applications, Springer, vol. 73(2), pages 411-452, June.
    7. Ching-pei Lee & Stephen J. Wright, 2020. "Inexact Variable Metric Stochastic Block-Coordinate Descent for Regularized Optimization," Journal of Optimization Theory and Applications, Springer, vol. 185(1), pages 151-187, April.
    8. R. Lopes & S. A. Santos & P. J. S. Silva, 2019. "Accelerating block coordinate descent methods with identification strategies," Computational Optimization and Applications, Springer, vol. 72(3), pages 609-640, April.
    9. Mingyi Hong & Tsung-Hui Chang & Xiangfeng Wang & Meisam Razaviyayn & Shiqian Ma & Zhi-Quan Luo, 2020. "A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization," Mathematics of Operations Research, INFORMS, vol. 45(3), pages 833-861, August.
    10. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2016. "Exact worst-case performance of first-order methods for composite convex optimization," LIDAM Discussion Papers CORE 2016052, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    11. S. Bonettini & M. Prato & S. Rebegoldi, 2018. "A block coordinate variable metric linesearch based proximal gradient method," Computational Optimization and Applications, Springer, vol. 71(1), pages 5-52, September.
    12. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
    13. Andrei Patrascu & Ion Necoara, 2015. "Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization," Journal of Global Optimization, Springer, vol. 61(1), pages 19-46, January.
    14. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2018. "Exact Worst-Case Convergence Rates of the Proximal Gradient Method for Composite Convex Minimization," Journal of Optimization Theory and Applications, Springer, vol. 178(2), pages 455-476, August.
    15. Majid Jahani & Naga Venkata C. Gudapati & Chenxin Ma & Rachael Tappenden & Martin Takáč, 2021. "Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences," Computational Optimization and Applications, Springer, vol. 79(2), pages 369-404, June.
    16. Jin Zhang & Xide Zhu, 2022. "Linear Convergence of Prox-SVRG Method for Separable Non-smooth Convex Optimization Problems under Bounded Metric Subregularity," Journal of Optimization Theory and Applications, Springer, vol. 192(2), pages 564-597, February.
    17. Leonardo Galli & Alessandro Galligari & Marco Sciandrone, 2020. "A unified convergence framework for nonmonotone inexact decomposition methods," Computational Optimization and Applications, Springer, vol. 75(1), pages 113-144, January.
    18. Zhigang Li & Mingchuan Zhang & Junlong Zhu & Ruijuan Zheng & Qikun Zhang & Qingtao Wu, 2018. "Stochastic Block-Coordinate Gradient Projection Algorithms for Submodular Maximization," Complexity, Hindawi, vol. 2018, pages 1-11, December.
    19. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization," Computational Optimization and Applications, Springer, vol. 79(3), pages 681-715, July.
    20. Tao Sun & Yuejiao Sun & Yangyang Xu & Wotao Yin, 2020. "Markov chain block coordinate descent," Computational Optimization and Applications, Springer, vol. 75(1), pages 35-61, January.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:coopap:v:70:y:2018:i:2:d:10.1007_s10589-018-9984-3. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.