
Title:
Dominating Set Model Aggregation for communication-efficient decentralized deep learning
Contributors:
Department of Mechanical Engineering; Department of Computer Science
Publisher Information:
Elsevier
Publication Year:
2023
Collection:
Digital Repository @ Iowa State University
Document Type:
Journal article (article in journal/newspaper)
File Description:
application/pdf
Language:
English
Rights:
This version of record is licensed under CC BY-NC-ND.
Accession Number:
edsbas.BBA3E7C9
Database:
BASE

Additional Information

Decentralized deep learning algorithms leverage peer-to-peer communication of model parameters and/or gradients over communication graphs among learning agents, each with access to its own private data set. Most studies in this area focus on achieving high accuracy, often at the expense of increased communication overhead among the agents. However, large peer-to-peer communication overhead is often a practical challenge, especially in harsh environments such as underwater sensor networks. In this paper, we aim to reduce communication overhead while achieving performance similar to state-of-the-art algorithms. To achieve this, we use the concept of a Minimum Connected Dominating Set from graph theory, which is applied in ad hoc wireless networks to address communication overhead issues. Specifically, we propose a new decentralized deep learning algorithm called minimum connected Dominating Set Model Aggregation (DSMA). We investigate the efficacy of our method on different communication graph topologies, with small to large numbers of agents and varied neural network architectures. Empirical results on benchmark data sets show a significant (up to 100X) reduction in communication time while preserving, or in some cases improving, accuracy compared to state-of-the-art methods. We also present an analysis showing the convergence of the proposed algorithm.

This is a version of record published as Fotouhi, Fateme, Aditya Balu, Zhanhong Jiang, Yasaman Esfandiari, Salman Jahani, and Soumik Sarkar. "Dominating Set Model Aggregation for communication-efficient decentralized deep learning." Neural Networks 171 (2024): 25-39. doi: https://doi.org/10.1016/j.neunet.2023.11.057.
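
The abstract's core idea is to route model aggregation through a small connected dominating set of the communication graph so that fewer links carry model traffic. As a rough illustration only, and not the paper's DSMA algorithm, the following Python sketch shows one standard greedy way to approximate a minimum connected dominating set on a networkx graph; the topology, function name, and parameters are assumptions introduced for this example.

import networkx as nx

def greedy_connected_dominating_set(G: nx.Graph) -> set:
    """Greedy approximation of a (minimum) connected dominating set.

    Seeds the set with the highest-degree node and repeatedly adds the
    neighbor of the current set that dominates the most still-uncovered
    nodes, so the set stays connected by construction.
    """
    assert nx.is_connected(G), "communication topology must be connected"
    start = max(G.nodes, key=G.degree)            # seed with the hub node
    cds = {start}
    covered = {start} | set(G.neighbors(start))   # nodes dominated so far
    while len(covered) < G.number_of_nodes():
        # Candidates adjacent to the current set keep the set connected.
        frontier = {v for u in cds for v in G.neighbors(u)} - cds
        best = max(frontier, key=lambda v: len(set(G.neighbors(v)) - covered))
        cds.add(best)
        covered |= {best} | set(G.neighbors(best))
    return cds

if __name__ == "__main__":
    # Hypothetical 16-agent topology, illustrative only.
    G = nx.connected_watts_strogatz_graph(n=16, k=4, p=0.2, seed=0)
    cds = greedy_connected_dominating_set(G)
    print(f"{len(cds)} of {G.number_of_nodes()} agents form the dominating set:", sorted(cds))

In a dominating-set-based aggregation scheme, only the agents in such a set would need to collect and redistribute model parameters, which is the intuition behind the communication savings reported above; the paper's actual construction and aggregation rule are given in the full text.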