Communication and Energy Efficient Slimmable Federated Learning via Superposition Coding and Successive Decoding. (arXiv:2112.03267v1 [cs.LG])

Mobile devices are indispensable sources of big data. Federated learning (FL)
has a great potential in exploiting these private data by exchanging locally
trained models instead of their raw data. However, mobile devices are often
energy-limited and wirelessly connected, and FL cannot cope flexibly with their
heterogeneous and time-varying energy capacity and communication throughput,
limiting its adoption. Motivated by these issues, we propose a novel energy- and
communication-efficient FL framework, coined SlimFL. To resolve the
heterogeneous energy capacity problem, each device in SlimFL runs a
width-adjustable slimmable neural network (SNN). To address the heterogeneous
communication throughput problem, each full-width (1.0x) SNN model and its
half-width (0.5x) model are superposition-coded before transmission, and
successively decoded after reception as the 0.5x or 1.0x model depending on
the channel quality. Simulation results show that SlimFL can simultaneously
train both 0.5x and 1.0x models with reasonable accuracy and convergence
speed, compared to its vanilla FL counterpart separately training the two
models using 2x more communication resources. Surprisingly, SlimFL achieves
even higher accuracy with lower energy footprints than vanilla FL for poor
channels and non-IID data distributions, under which vanilla FL converges
slowly.
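The superposition coding and successive decoding step can be pictured with a small numeric sketch. The snippet below is a minimal illustration, not the paper's algorithm: it assumes sign-quantized updates, a hypothetical power split `alpha` favoring the 0.5x sub-model, and a boolean `snr_ok` standing in for the channel-quality check. The device sums the two power-scaled messages (superposition coding), and the receiver decodes the strong 0.5x part first and, on a good channel, subtracts it to recover the 1.0x residual (successive decoding).

```python
import numpy as np

# Minimal, hypothetical sketch of superposition coding + successive decoding
# for two sign-quantized model updates: the 0.5x sub-model update ("base")
# and the extra parameters completing the 1.0x model ("residual").
# The power split `alpha` and the `snr_ok` flag are illustrative assumptions.

rng = np.random.default_rng(0)

def sic_receive(y, alpha, snr_ok):
    """Successive decoding: recover the strong 0.5x symbols first; on a good
    channel, subtract them and recover the weaker 1.0x residual symbols."""
    base_hat = np.sign(y)                      # 0.5x layer, residual treated as noise
    if not snr_ok:
        return base_hat, None                  # poor channel: keep only the 0.5x update
    residual_hat = np.sign(y - np.sqrt(alpha) * base_hat)
    return base_hat, residual_hat              # good channel: full 1.0x update

# Sign-quantized +/-1 "updates" of equal length, for simplicity.
base = np.sign(rng.normal(size=16))            # 0.5x sub-model update
residual = np.sign(rng.normal(size=16))        # remaining 1.0x parameters

alpha = 0.8                                    # most transmit power on the 0.5x message
tx = np.sqrt(alpha) * base + np.sqrt(1 - alpha) * residual   # superposition coding
rx = tx + rng.normal(scale=0.05, size=tx.shape)              # mild channel noise

b_hat, r_hat = sic_receive(rx, alpha, snr_ok=True)
print("0.5x update recovered exactly:", np.array_equal(b_hat, base))
print("1.0x residual recovered exactly:", r_hat is not None and np.array_equal(r_hat, residual))
```

With `alpha = 0.8` the 0.5x symbols dominate the composite signal (a margin of roughly 0.45 per symbol), so the first decoding stage succeeds even without subtraction; this mirrors how a poor-channel receiver in SlimFL still obtains a usable 0.5x model while a good-channel receiver recovers the full 1.0x model.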

Source: https://arxiv.org/abs/2112.03267
