

Poster

Efficient and Light-Weight Federated Learning via Asynchronous Distributed Dropout

Chen Dun · Mirian Hipolito Garcia · Chris Jermaine · Dimitrios Dimitriadis · Anastasios Kyrillidis

Auditorium 1 Foyer 38

Abstract:

We focus on dropout techniques in asynchronous distributed computing for federated learning (FL) scenarios. We propose AsyncDrop, a novel asynchronous FL framework with smart (i.e., informed/structured) dropout. Overall, AsyncDrop achieves better performance compared to state-of-the-art asynchronous methodologies, while incurring lower communication and training-time overheads. The key idea revolves around creating "submodels" out of the global model and distributing their training to workers based on device heterogeneity. We rigorously justify that such an approach can be theoretically characterized. We implement our approach and compare it against other asynchronous baselines by adapting existing synchronous FL algorithms to asynchronous scenarios. Empirically, AsyncDrop reduces the communication cost and training time, while improving the final test accuracy in diverse non-i.i.d. FL scenarios.
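To make the key idea concrete, the sketch below illustrates structured dropout in an asynchronous FL loop: whole neurons (rows of a weight matrix) are masked out to form a "submodel" sized to each worker's capacity, the worker trains only its submodel, and the server merges each update as it arrives. The mask construction, local update, mixing weight `alpha`, and capacity values are illustrative assumptions, not the exact AsyncDrop algorithm from the paper.

```python
# Illustrative sketch of structured-dropout submodels in an asynchronous FL loop.
# Assumptions (not from the paper): a single linear layer as the "global model",
# one SGD step as local training, and a fixed mixing weight for merging updates.
import numpy as np

rng = np.random.default_rng(0)

def make_submodel_mask(num_neurons: int, keep_fraction: float) -> np.ndarray:
    """Structured dropout: keep a random subset of whole neurons (rows),
    sized according to the worker's device capacity."""
    keep = max(1, int(round(keep_fraction * num_neurons)))
    idx = rng.choice(num_neurons, size=keep, replace=False)
    mask = np.zeros(num_neurons, dtype=bool)
    mask[idx] = True
    return mask

def local_update(w_sub: np.ndarray, x: np.ndarray, y: np.ndarray, lr: float = 0.1) -> np.ndarray:
    """One SGD step on the submodel with squared loss (stand-in for local training)."""
    grad = (w_sub @ x - y)[:, None] * x[None, :] / len(y)
    return w_sub - lr * grad

num_neurons, dim = 8, 4
W_global = rng.normal(size=(num_neurons, dim))  # the full (global) model

# Heterogeneous workers: each keeps a different fraction of neurons.
workers = [{"keep": f} for f in (0.25, 0.5, 1.0)]

# Asynchrony is simulated by merging each worker's update as soon as it "finishes",
# without waiting for the others.
for step in range(10):
    worker = workers[step % len(workers)]
    mask = make_submodel_mask(num_neurons, worker["keep"])
    x, y = rng.normal(size=dim), rng.normal(size=mask.sum())
    W_new_sub = local_update(W_global[mask], x, y)
    # Merge only the rows this worker actually trained, damped by a mixing
    # weight (a stand-in for staleness-aware aggregation).
    alpha = 0.5
    W_global[mask] = (1 - alpha) * W_global[mask] + alpha * W_new_sub
```

In this toy version, the structured mask is what keeps both computation and communication proportional to each device's capacity: a slow worker trains and transmits only a quarter of the rows, while the server never blocks waiting for it.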
