

Poster

AFL: A Single-Round Analytic Approach for Federated Learning with Pre-trained Models

Run He · Kai Tong · Di Fang · Han Sun · Ziqian Zeng · Haoran Li · Tianyi Chen · Huiping Zhuang


Abstract: In this paper, we introduce analytic federated learning (AFL), a new training paradigm that brings analytical (i.e., closed-form) solutions to federated learning (FL) with pre-trained models. AFL draws inspiration from analytic learning, a gradient-free technique that trains neural networks with analytical solutions in one epoch. In the local client training stage, AFL requires only a single epoch of training, eliminating the need for multi-epoch updates. In the aggregation stage, we derive an absolute aggregation (AA) law. This AA law enables single-round aggregation, reducing heavy communication overhead and achieving fast convergence by removing the need for multiple aggregation rounds. More importantly, AFL is invariant to data partitioning: regardless of how the full dataset is distributed among clients, the aggregated result remains identical. This property yields further benefits, such as invariance to data heterogeneity and to the number of clients. We conduct experiments across various FL settings, including extremely non-IID ones and scenarios with a large number of clients (e.g., 1000). In all these settings, our AFL consistently performs competitively while existing FL techniques encounter various obstacles.
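To make the idea of closed-form local training and single-round aggregation concrete, below is a minimal sketch assuming a ridge-regression head fitted on frozen pre-trained features. The function names, the regularization parameter gamma, and this particular form of the solution are illustrative assumptions, not the paper's exact AA law.

```python
# Minimal sketch of closed-form local "training" plus single-round server
# aggregation, in the spirit of AFL. Assumes each client fits a linear head
# on frozen pre-trained features via ridge regression; this ridge-style
# variant is illustrative only, not the paper's AA law.
import numpy as np

def client_statistics(features, labels):
    """One-epoch local step: compute sufficient statistics analytically.
    features: (n_k, d) pre-trained-model embeddings on client k.
    labels:   (n_k, c) one-hot label matrix on client k.
    """
    A_k = features.T @ features   # (d, d) local autocorrelation
    b_k = features.T @ labels     # (d, c) local cross-correlation
    return A_k, b_k

def aggregate(stats, d, gamma=1.0):
    """Single-round aggregation: sum client statistics, solve in closed form."""
    A = sum(A_k for A_k, _ in stats)
    b = sum(b_k for _, b_k in stats)
    # Closed-form solution; identical no matter how data was partitioned,
    # because A and b are plain sums over all samples.
    return np.linalg.solve(A + gamma * np.eye(d), b)

# Usage: two clients holding arbitrary splits of the same data produce the
# same classifier weights as a single client holding everything.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))
Y = np.eye(4)[rng.integers(0, 4, 100)]
W_split = aggregate([client_statistics(X[:30], Y[:30]),
                     client_statistics(X[30:], Y[30:])], d=16)
W_whole = aggregate([client_statistics(X, Y)], d=16)
assert np.allclose(W_split, W_whole)  # data-partition invariance
```

Because the server only sums per-client statistics before solving, the result depends solely on the pooled data, which illustrates why an aggregation of this form is invariant to how the data is distributed among clients.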
