

Poster

Subspace Constraint and Contribution Estimation for Heterogeneous Federated Learning

Xiangtao Zhang · Sheng Li · Ao Li · Yipeng Liu · Fan Zhang · Ce Zhu · Le Zhang


Abstract: Heterogeneous Federated Learning (HFL) has received widespread attention due to its adaptability to diverse models and data. HFL approaches that use auxiliary models for knowledge transfer offer additional flexibility; however, existing frameworks suffer from aggregation bias and local overfitting. To address these issues, we propose FedSCE. It reduces the degrees of freedom of local updates and improves generalization by restricting the updates of specific layers of the local model to a local subspace; this subspace is updated dynamically so that it continues to cover the latest model-update trajectory. In addition, FedSCE estimates each client's contribution from the update distance of the auxiliary model in both feature space and parameter space, enabling adaptive weighted aggregation. We validate our approach in both feature-skewed and label-skewed scenarios, showing that on the Office10 dataset our method exceeds the best baseline by 3.87. Our source code will be released.
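The two mechanisms the abstract describes — constraining a layer's update to a subspace spanned by recent update directions, and weighting clients for aggregation by an update distance — can be sketched as follows. This is a minimal illustration of the general idea, not the paper's implementation: the subspace rank `k`, the use of an SVD over the update history, and the distance-to-weight normalization are all assumptions, since the abstract does not specify them.

```python
import numpy as np

def update_subspace(update_history, k):
    """Assumed construction: take the top-k left singular vectors of the
    matrix whose columns are recent (flattened) layer updates, so the
    subspace covers the latest update trajectory."""
    trajectory = np.stack(update_history, axis=1)      # (dim, num_updates)
    U, _, _ = np.linalg.svd(trajectory, full_matrices=False)
    return U[:, :k]                                    # orthonormal basis, (dim, k)

def project_update(update, basis):
    """Constrain a flattened layer update to the local subspace by
    orthogonal projection onto the span of `basis`."""
    return basis @ (basis.T @ update)

def contribution_weights(distances):
    """Hypothetical mapping from per-client update distances (e.g. measured
    on the auxiliary model) to aggregation weights; here a simple
    normalization so the weights sum to 1."""
    d = np.asarray(distances, dtype=float)
    return d / d.sum()

def weighted_aggregate(client_params, weights):
    """Adaptive weighted aggregation of client parameter vectors."""
    return sum(w * p for w, p in zip(weights, client_params))
```

For example, projecting the update `[1, 2, 3]` onto the subspace spanned by the first two coordinate axes yields `[1, 2, 0]`: the component outside the subspace is discarded, which is how the constraint reduces the degrees of freedom of the update.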
