FedHarmony: Harmonizing Heterogeneous Label Correlations in Federated Multi-Label Learning
Abstract
Multi-label representations encode higher-order label dependencies, yet in federated settings the local estimates of these dependencies are statistically inconsistent, causing structural drift across clients and rendering naive quantity-weighted aggregation suboptimal. We propose FedHarmony, a federated multi-label learning framework that harmonizes heterogeneous label correlations without sharing raw data. A Correlation Expert is formed by leave-one-out consolidation of clients’ label–label correlation statistics, providing each client with a round-wise global consensus. Guided by this expert, each client performs consensus-guided correction that aligns its local correlation matrix with the consensus within clusters of strongly related labels, obtained via spectral clustering of the expert matrix; this block-wise alignment targets dense, high-signal subspaces. We establish two guarantees: (i) restricting alignment to in-cluster pairs strictly improves the optimization curvature and the linear convergence rate; (ii) ignoring cross-cluster entries incurs only a bounded, quantitatively small information loss when the consensus is near block-diagonal. Finally, a correlation-aware central aggregation combines data quantity with a measure of correlation learning quality, using a dynamic balance factor that shifts from quantity-driven weighting in early rounds to structure-driven weighting later. Extensive experiments under diverse non-IID regimes (varying label distributions, client heterogeneity, and client counts) show consistent gains over federated baselines in mAP, F1, and Hamming loss, together with improved stability and communication efficiency.
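To make the pipeline concrete, the following minimal sketch illustrates the leave-one-out consensus and the in-cluster alignment step described above. The function names (`correlation_expert`, `label_clusters`, `in_cluster_alignment_loss`), the use of |correlation| as a spectral-clustering affinity, and the squared-Frobenius penalty are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def correlation_expert(local_corrs, weights, leave_out):
    """Leave-one-out consolidation: average all clients' label-label
    correlation matrices except the held-out client's own."""
    keep = [k for k in range(len(local_corrs)) if k != leave_out]
    w = np.array([weights[k] for k in keep], dtype=float)
    w /= w.sum()
    return sum(wk * local_corrs[k] for wk, k in zip(w, keep))

def label_clusters(expert, n_clusters):
    """Spectral clustering of the expert matrix, interpreted as a
    label-label affinity graph (assumption: |corr| as affinity)."""
    affinity = np.abs(expert)
    sc = SpectralClustering(n_clusters=n_clusters,
                            affinity="precomputed", random_state=0)
    return sc.fit_predict(affinity)

def in_cluster_alignment_loss(local_corr, expert, clusters, lam=1.0):
    """Block-wise consensus-guided correction: penalize deviation from
    the consensus only on label pairs in the same cluster."""
    mask = (clusters[:, None] == clusters[None, :]).astype(float)
    diff = (local_corr - expert) * mask
    return lam * np.sum(diff ** 2)
```

Restricting the mask to in-cluster pairs is what confines the penalty to the dense, high-signal blocks; cross-cluster entries are simply zeroed out.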
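Analogously, a hedged sketch of the correlation-aware central aggregation: the linear schedule for the balance factor beta(t) and the per-client `corr_quality` score (a stand-in for the measure of correlation learning quality, e.g. inverse alignment error) are assumptions for illustration only.

```python
import numpy as np

def aggregation_weights(n_samples, corr_quality, round_t, total_rounds):
    """Blend quantity-based weights with structure-based weights via a
    dynamic balance factor beta(t) that grows over rounds, so weighting
    is quantity-driven early and structure-driven later."""
    n = np.asarray(n_samples, dtype=float)
    q = np.asarray(corr_quality, dtype=float)
    w_qty = n / n.sum()            # classic quantity weighting
    w_struct = q / q.sum()         # correlation-quality weighting
    beta = round_t / total_rounds  # assumption: linear schedule in [0, 1]
    w = (1.0 - beta) * w_qty + beta * w_struct
    return w / w.sum()

# Example: three clients, round 8 of 10 -> weights lean toward quality.
print(aggregation_weights([500, 2000, 800], [0.9, 0.4, 0.7], 8, 10))
```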