

Poster

FedHARM: Harmonizing Model Architectural Diversity in Federated Learning

Anestis Kastellos · Athanasios Psaltis · Charalampos Z Patrikakis · Petros Daras

Strong blind review: This paper was not made available on public preprint services during the review process.
Fri 4 Oct 1:30 a.m. PDT — 3:30 a.m. PDT

Abstract:

In Federated Learning (FL), managing variability in model architectures is more than a technical barrier: it is a crucial aspect of the field's evolution, especially given the ever-increasing number of model architectures emerging in the literature. This variability arises from the nature of FL itself, in which diverse devices or participants, each with their own data and computational constraints, collaboratively train a shared model. The proposed FL system architecture enables the deployment of distinct convolutional neural network (CNN) architectures across clients while outperforming state-of-the-art FL methodologies. FedHARM capitalizes on the strengths of the different architectures, and limits their weaknesses, by converging each local client on a shared dataset, achieving superior performance on the test set.
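The abstract does not spell out the harmonization algorithm. The sketch below is one plausible interpretation, not FedHARM's published method: clients with structurally different models predict on a shared dataset, a consensus is formed by averaging their predictions, and each client takes gradient steps toward that consensus. The client classes, the MSE-to-consensus objective, and all hyperparameters here are illustrative assumptions.

```python
import random

random.seed(0)
X = [random.uniform(-1.0, 1.0) for _ in range(32)]  # shared dataset (assumed)


class LinearClient:
    """y = w*x : a 1-parameter stand-in for one 'architecture'."""

    def __init__(self, w):
        self.w = w

    def predict(self, x):
        return self.w * x

    def step(self, xs, targets, lr=0.1):
        # One gradient step of MSE(prediction, consensus target) w.r.t. w.
        g = sum(2 * x * (self.predict(x) - t) for x, t in zip(xs, targets)) / len(xs)
        self.w -= lr * g


class AffineClient:
    """y = w*x + b : a structurally different 2-parameter stand-in."""

    def __init__(self, w, b):
        self.w, self.b = w, b

    def predict(self, x):
        return self.w * x + self.b

    def step(self, xs, targets, lr=0.1):
        n = len(xs)
        gw = sum(2 * x * (self.predict(x) - t) for x, t in zip(xs, targets)) / n
        gb = sum(2 * (self.predict(x) - t) for x, t in zip(xs, targets)) / n
        self.w -= lr * gw
        self.b -= lr * gb


def disagreement(clients, xs):
    """Largest gap between any client's prediction and the consensus."""
    preds = [[c.predict(x) for x in xs] for c in clients]
    cons = [sum(col) / len(clients) for col in zip(*preds)]
    return max(abs(p, ) if False else abs(p - t) for row in preds for p, t in zip(row, cons))


clients = [LinearClient(2.0), AffineClient(-1.0, 0.5)]  # heterogeneous clients

before = disagreement(clients, X)
for _ in range(100):  # harmonization rounds on the shared dataset
    consensus = [sum(c.predict(x) for c in clients) / len(clients) for x in X]
    for c in clients:
        c.step(X, consensus)
after = disagreement(clients, X)
# After the rounds, the heterogeneous clients agree closely on the shared set
# (after << before), without ever exchanging parameters directly.
```

Because only predictions on the shared dataset are exchanged, clients never need compatible parameter tensors, which is what makes such a scheme viable across architecturally diverse models.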
