

Poster

Class-Wise Federated Averaging for Efficient Personalization

Gyuejeong Lee · Daeyoung Choi


Abstract: Federated learning (FL) enables collaborative model training across distributed clients without centralizing data. However, existing approaches such as Federated Averaging ($\texttt{FedAvg}$) often perform poorly under heterogeneous data distributions, failing to achieve personalization because they cannot capture class-specific information effectively. To overcome $\texttt{FedAvg}$'s personalization limitations, we propose Class-wise Federated Averaging ($\texttt{cwFedAvg}$), a novel personalized FL (PFL) framework that performs federated averaging for each class. $\texttt{cwFedAvg}$ creates class-specific global models via weighted aggregation of local models using class distributions, then combines them to generate personalized local models. To facilitate effective class-wise aggregation, we further propose the Weight Distribution Regularizer ($\texttt{WDR}$), which encourages deep networks to encode class-specific information efficiently by aligning the empirical class distribution with an approximated class distribution derived from the output-layer weights. Our experiments demonstrate that $\texttt{cwFedAvg}$ outperforms existing PFL methods through efficient personalization while maintaining $\texttt{FedAvg}$'s communication cost and avoiding additional local training and pairwise computations.
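The sketch below illustrates the two ideas described in the abstract: per-class weighted aggregation of local models followed by a personalized combination, plus a toy WDR-style penalty. It is a minimal sketch under assumptions, not the paper's implementation: the client/class weighting, the dict-of-arrays model format, and the use of output-layer row norms as the "approximated" class distribution are all illustrative choices; the paper's exact formulas may differ.

```python
# Hypothetical sketch of class-wise aggregation and a WDR-like penalty.
# Assumptions (not from the source): local models are dicts of numpy arrays,
# `class_dists[k, c]` is client k's empirical fraction of class-c samples, and
# the approximated class distribution is taken from normalized output-weight row norms.
import numpy as np

def classwise_aggregate(local_models, class_dists):
    """Build one global model per class by weighting each client's model
    with its (normalized) share of that class."""
    num_classes = class_dists.shape[1]
    globals_per_class = []
    for c in range(num_classes):
        w = class_dists[:, c]
        w = w / (w.sum() + 1e-12)  # normalize weights over clients for class c
        agg = {name: sum(w[k] * m[name] for k, m in enumerate(local_models))
               for name in local_models[0]}
        globals_per_class.append(agg)
    return globals_per_class

def personalize(globals_per_class, client_dist):
    """Combine class-specific global models using the client's own class mix."""
    p = client_dist / (client_dist.sum() + 1e-12)
    return {name: sum(p[c] * g[name] for c, g in enumerate(globals_per_class))
            for name in globals_per_class[0]}

def wdr_penalty(output_weights, empirical_dist):
    """Toy WDR-style term: squared distance between the empirical class
    distribution and a distribution read off the output-layer weight norms."""
    norms = np.linalg.norm(output_weights, axis=1)  # one norm per class row
    approx = norms / (norms.sum() + 1e-12)
    return float(np.sum((approx - empirical_dist) ** 2))

# Tiny usage example: 3 clients, 4 classes, a single output-layer "model".
rng = np.random.default_rng(0)
models = [{"out.weight": rng.normal(size=(4, 8))} for _ in range(3)]
dists = rng.dirichlet(np.ones(4), size=3)  # heterogeneous client class mixes
per_class = classwise_aggregate(models, dists)
personal = personalize(per_class, dists[0])
print(personal["out.weight"].shape, wdr_penalty(models[0]["out.weight"], dists[0]))
```

Note that, as in the abstract's claim about communication cost, each client in this sketch still uploads a single model per round; the class-wise bookkeeping happens server-side.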
