

FedAS: Bridging Inconsistency in Personalized Federated Learning

Xiyuan Yang · Wenke Huang · Mang Ye

Arch 4A-E Poster #229
Thu 20 Jun 10:30 a.m. PDT — noon PDT


Personalized Federated Learning (PFL) is primarily designed to provide a customized model for each client to better fit non-IID client data, an inherent challenge in Federated Learning. However, current PFL methods suffer from inconsistency at both the intra-client and inter-client levels: 1) Intra-client inconsistency stems from the asynchronous update strategy for personalized and shared parameters in PFL. Clients update their shared parameters to communicate with and learn from other clients while keeping the personalized parts unchanged, leading to poor coordination between these two components. 2) Inter-client inconsistency arises from "stragglers" - inactive clients that communicate and train with the server less frequently. This leaves their personalized models under-trained and impedes the collaborative training of other clients. In this paper, we present a novel PFL framework named FedAS, which uses Federated Parameter-Alignment and Client-Synchronization to overcome the above challenges. First, we enhance the localization of global parameters by infusing them with local insights, increasing their relevance and consistency with local information. To achieve this, we make the shared parts learn from the previous local model, thereby reducing the impact of parameter inconsistency. Furthermore, we design a robust aggregation method that mitigates the impact of stragglers by preventing their under-trained knowledge from being incorporated into the globally aggregated model. Experimental results on the CIFAR-10 and CIFAR-100 datasets validate the effectiveness of FedAS in achieving better personalization performance and robustness against data heterogeneity.
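The straggler-robust aggregation described above can be illustrated with a minimal sketch. This is not the authors' actual implementation; the threshold `min_rounds`, the participation counts, and the data-size weighting are all illustrative assumptions about how under-trained clients might be filtered out of the server-side average:

```python
import numpy as np

def aggregate_excluding_stragglers(client_params, client_sizes,
                                   rounds_participated, min_rounds=3):
    """Average shared parameters over sufficiently trained clients only.

    client_params       : list of dicts mapping parameter name -> np.ndarray
    client_sizes        : local dataset size per client (aggregation weight)
    rounds_participated : how many communication rounds each client has trained
    min_rounds          : hypothetical cutoff below which a client is treated
                          as a straggler and excluded from aggregation
    """
    active = [i for i, r in enumerate(rounds_participated) if r >= min_rounds]
    if not active:
        # Fall back to plain FedAvg if every client would be excluded.
        active = list(range(len(client_params)))
    total = sum(client_sizes[i] for i in active)
    aggregated = {}
    for name in client_params[active[0]]:
        # Data-size-weighted average over the active (non-straggler) clients.
        aggregated[name] = sum(client_params[i][name] * (client_sizes[i] / total)
                               for i in active)
    return aggregated
```

For example, with three clients whose shared weight is 1.0, 3.0, and 100.0, where the third client has only trained one round, the straggler is excluded and the aggregate is the mean of the two active clients (2.0) rather than being skewed by the under-trained model.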
