
A Distributed Optimization Accelerated Algorithm with Uncoordinated Time-Varying Step-Sizes in an Undirected Network

Author

Listed:
  • Yunshan Lü

    (Database and Artificial Intelligence Laboratory, College of Computer and Information Science, Southwest University, Chongqing 400715, China
    College of Big Data and Software, Chongqing College of Mobile Communication, Chongqing 401520, China)

  • Hailing Xiong

    (Database and Artificial Intelligence Laboratory, College of Computer and Information Science, Southwest University, Chongqing 400715, China
    Business College, Southwest University, Chongqing 402460, China)

  • Hao Zhou

    (Database and Artificial Intelligence Laboratory, College of Computer and Information Science, Southwest University, Chongqing 400715, China)

  • Xin Guan

    (Database and Artificial Intelligence Laboratory, College of Computer and Information Science, Southwest University, Chongqing 400715, China)

Abstract

In recent years, significant progress has been made in the field of distributed optimization algorithms. This study focuses on the distributed convex optimization problem over an undirected network, where the goal is to minimize the average of the local objective functions held by the agents while each agent communicates necessary information only with its neighbors. Building on a state-of-the-art algorithm, we propose a novel distributed optimization algorithm for the case in which each agent's objective function is smooth and strongly convex. Faster convergence is attained by employing Nesterov and heavy-ball acceleration simultaneously, making the algorithm applicable to many large-scale distributed tasks. Moreover, the step-sizes and acceleration (momentum) coefficients are designed to be uncoordinated, time-varying, and nonidentical across agents, which allows the algorithm to adapt to a wide range of application scenarios. Under some necessary assumptions and conditions, a rigorous theoretical analysis establishes a linear convergence rate. Finally, numerical experiments on a real dataset demonstrate the efficacy of the proposed algorithm and its superiority over similar algorithms.
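To make the ingredients named in the abstract concrete, the following Python sketch shows one generic way such a scheme can be organized: a gradient-tracking consensus update combined with a Nesterov-style extrapolation step and a heavy-ball momentum term, where the step-sizes and momentum coefficients are supplied per agent and per iteration (uncoordinated and time-varying). This is only an illustration under stated assumptions, not the paper's actual algorithm; all names (distributed_accel, W, alpha, beta, eta) and the exact update order are hypothetical.

```python
# Illustrative sketch only: a gradient-tracking style distributed update with
# Nesterov-type extrapolation and a heavy-ball momentum term, using
# uncoordinated, time-varying per-agent step-sizes. Not the paper's algorithm.
import numpy as np

def distributed_accel(W, grads, x0, alpha, beta, eta, T=200):
    """W: doubly stochastic mixing matrix of an undirected network, shape (N, N).
    grads: list of N local gradient functions grad_i(x), each mapping (d,) -> (d,).
    x0: initial states, shape (N, d).
    alpha, beta, eta: callables t -> array of shape (N,), giving each agent its
    own step-size and momentum coefficients at iteration t (uncoordinated,
    time-varying, nonidentical).
    """
    N, d = x0.shape
    x, x_prev = x0.copy(), x0.copy()
    # Gradient-tracking variable: each row estimates the network-average gradient.
    y = np.stack([grads[i](x[i]) for i in range(N)])
    for t in range(T):
        a, b, e = alpha(t), beta(t), eta(t)
        # Nesterov-style extrapolation using the previous iterate.
        v = x + e[:, None] * (x - x_prev)
        # Mixing (consensus) step, descent along the tracked gradient,
        # plus a heavy-ball momentum term.
        x_new = W @ v - a[:, None] * y + b[:, None] * (x - x_prev)
        # Update the gradient tracker with the change in local gradients.
        g_new = np.stack([grads[i](x_new[i]) for i in range(N)])
        g_old = np.stack([grads[i](x[i]) for i in range(N)])
        y = W @ y + g_new - g_old
        x_prev, x = x, x_new
    return x
```

Passing alpha, beta, and eta as functions of the iteration counter is one simple way to model uncoordinated time-varying parameters: each agent draws its own coefficient from the returned vector, so no global step-size needs to be agreed upon.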

Suggested Citation

  • Yunshan Lü & Hailing Xiong & Hao Zhou & Xin Guan, 2022. "A Distributed Optimization Accelerated Algorithm with Uncoordinated Time-Varying Step-Sizes in an Undirected Network," Mathematics, MDPI, vol. 10(3), pages 1-17, January.
  • Handle: RePEc:gam:jmathe:v:10:y:2022:i:3:p:357-:d:732834

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/10/3/357/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/10/3/357/
    Download Restriction: no



      Corrections

      All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:10:y:2022:i:3:p:357-:d:732834. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com.

      Please note that corrections may take a couple of weeks to filter through the various RePEc services.

      IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.