Author
Listed:
- Xiaofeng Xue
(Information Countermeasure Technique Institute, School of Cyberspace Science, Faculty of Computing, Harbin Institute of Technology, Harbin 150001, China)
- Haokun Mao
(Information Countermeasure Technique Institute, School of Cyberspace Science, Faculty of Computing, Harbin Institute of Technology, Harbin 150001, China)
- Qiong Li
(Information Countermeasure Technique Institute, School of Cyberspace Science, Faculty of Computing, Harbin Institute of Technology, Harbin 150001, China)
- Furong Huang
(School of International Studies, Harbin Institute of Technology, Harbin 150001, China)
- Ahmed A. Abd El-Latif
(EIAS Data Science Lab, College of Computer and Information Sciences, Prince Sultan University, Riyadh 11586, Saudi Arabia
Department of Mathematics and Computer Science, Faculty of Science, Menoufia University, Menofia 32511, Egypt)
Abstract
Specializing Directed Acyclic Graph Federated Learning (SDAGFL) is a new federated learning framework that offers decentralization, personalization, and resistance to single points of failure and poisoning attacks. Instead of training a single global model, clients in SDAGFL asynchronously update their models using devices with similar data distributions through Directed Acyclic Graph Distributed Ledger Technology (DAG-DLT), which is designed for IoT scenarios. Because of the many features inherited from DAG-DLT, SDAGFL is well suited to IoT scenarios in many respects. However, its training process is energy-consuming: to obtain the "reference model" used to judge whether to broadcast a newly trained model, each client must compute the confidence and rating of the nodes selected by multiple random walks, traversing the ledger to a depth of 15–25. Energy consumption is an important issue in IoT scenarios, as most devices are battery-powered and operate under strict energy constraints. To optimize SDAGFL for IoT, this paper proposes an energy-efficient SDAGFL based on an event-triggered communication mechanism, i.e., ESDAGFL. In ESDAGFL, a new model is broadcast only when it differs significantly from the previous one, instead of traversing the ledger to search for the "reference model". We evaluate ESDAGFL on the FMNIST-clustered and Poets datasets, with simulations performed on a platform with an Intel Core i7-10700 CPU. The results demonstrate that ESDAGFL achieves a balance between training accuracy and specialization as good as SDAGFL's. Moreover, ESDAGFL reduces energy consumption by 42.5% and 51.7% on the FMNIST-clustered and Poets datasets, respectively.
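The event-triggered mechanism described in the abstract can be sketched as a simple threshold test on the distance between the newly trained parameters and the last broadcast ones. This is an illustrative reconstruction, not the paper's actual implementation: the function name, the choice of the L2 norm, and the threshold value are all assumptions for the sake of example.

```python
import numpy as np

def should_broadcast(new_params, last_broadcast_params, threshold=0.1):
    """Event-trigger test (illustrative): broadcast only when the newly
    trained model deviates sufficiently from the last broadcast one.
    The L2 distance and the threshold value are assumptions, not the
    paper's exact criterion."""
    # Global L2 distance across all parameter arrays of the model.
    diff = np.sqrt(sum(np.sum((n - o) ** 2)
                       for n, o in zip(new_params, last_broadcast_params)))
    return bool(diff > threshold)

# Toy example with two "layers" of parameters.
old = [np.zeros(4), np.zeros(3)]
new = [np.full(4, 0.2), np.zeros(3)]
print(should_broadcast(new, old, threshold=0.1))  # True: distance 0.4 > 0.1
```

Under such a scheme, a client that makes only a marginal local update skips the broadcast entirely, avoiding the ledger traversal and random walks that dominate the energy cost in the original SDAGFL.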
Suggested Citation
Xiaofeng Xue & Haokun Mao & Qiong Li & Furong Huang & Ahmed A. Abd El-Latif, 2022.
"An Energy Efficient Specializing DAG Federated Learning Based on Event-Triggered Communication,"
Mathematics, MDPI, vol. 10(22), pages 1-17, November.
Handle:
RePEc:gam:jmathe:v:10:y:2022:i:22:p:4388-:d:979452