FedAlign: Differentially Private Distribution Alignment for Non-IID Federated Learning
Peng Wu ⋅ Jiapeng Zhang ⋅ Yingjie Song ⋅ Xiong Xiao ⋅ Zhuo Tang
Abstract
Federated Learning (FL) enables collaborative model training without sharing raw data, but client data are often Non-Independent and Identically Distributed (Non-IID), which slows convergence and degrades global performance. Privacy preservation is a further critical concern in FL. To address these two issues, we propose $\textit{FedAlign}$, a differentially private framework that aligns local data distributions via client-side statistical moment alignment. Clients upload perturbed distribution statistics, which the server aggregates to infer global distribution characteristics and to guide local alignment, thereby reducing inter-client discrepancies. Theoretical analysis and experiments show that FedAlign accelerates convergence and improves accuracy under Non-IID settings while preserving rigorous privacy guarantees.
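To make the workflow described above concrete, the following is a minimal Python sketch of differentially private moment sharing, not the paper's exact protocol: it assumes Gaussian-mechanism perturbation of clipped per-feature first and second moments, and the helper names (`client_moments`, `server_aggregate`, `gaussian_sigma`) and parameter choices are illustrative assumptions rather than definitions from FedAlign.

```python
import numpy as np

def gaussian_sigma(sensitivity: float, epsilon: float, delta: float) -> float:
    """Noise scale of the standard Gaussian mechanism (valid for epsilon < 1)."""
    return sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon

def client_moments(features: np.ndarray, clip: float, epsilon: float, delta: float,
                   rng: np.random.Generator):
    """Return DP-perturbed per-feature first and second moments of one client's data.

    Each sample is clipped to L2 norm `clip`, bounding the sensitivity of the
    averaged moments by clip / n (first moment) and clip**2 / n (second moment).
    """
    n = features.shape[0]
    norms = np.linalg.norm(features, axis=1, keepdims=True)
    clipped = features * np.minimum(1.0, clip / np.maximum(norms, 1e-12))

    m1 = clipped.mean(axis=0)         # first moment (mean)
    m2 = (clipped ** 2).mean(axis=0)  # second raw moment

    m1_noisy = m1 + rng.normal(0.0, gaussian_sigma(clip / n, epsilon, delta), m1.shape)
    m2_noisy = m2 + rng.normal(0.0, gaussian_sigma(clip ** 2 / n, epsilon, delta), m2.shape)
    return m1_noisy, m2_noisy

def server_aggregate(client_stats, client_sizes):
    """Weight each client's noisy moments by sample count to estimate global statistics."""
    weights = np.asarray(client_sizes, dtype=float)
    weights /= weights.sum()
    m1 = sum(w * s[0] for w, s in zip(weights, client_stats))
    m2 = sum(w * s[1] for w, s in zip(weights, client_stats))
    global_var = np.maximum(m2 - m1 ** 2, 1e-6)  # variance from raw moments
    return m1, global_var

# Toy example: three clients with shifted feature distributions (Non-IID).
rng = np.random.default_rng(0)
clients = [rng.normal(loc=mu, scale=1.0, size=(500, 8)) for mu in (-1.0, 0.0, 2.0)]
stats = [client_moments(x, clip=5.0, epsilon=0.5, delta=1e-5, rng=rng) for x in clients]
global_mean, global_var = server_aggregate(stats, [len(x) for x in clients])
# Each client could then standardize its features toward (global_mean, global_var)
# before local training, reducing inter-client distribution discrepancy.
print(global_mean.round(2), global_var.round(2))
```

In this sketch the only values leaving a client are the noise-perturbed moments, so the raw samples are never shared; how FedAlign calibrates noise and performs the local alignment step is specified in the paper itself.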