

Poster

Federated Representation Angle Learning

Liping Yi · Han Yu · Gang Wang · Xiaoguang Liu · Xiaoxiao Li


Abstract:

Model-heterogeneous federated learning (MHFL) is a challenging FL paradigm that allows FL clients to train structurally heterogeneous models under the coordination of an FL server. Existing MHFL methods transfer global knowledge to clients only by sharing partial homogeneous model parameters or by computing distance losses, which limits the generalization of the resulting models. To bridge this gap, we propose a novel model-heterogeneous Federated learning method with Representation Angle Learning (FedRAL). It consists of three innovative designs: (1) We introduce representation angle learning into MHFL. Specifically, we embed a homogeneous square matrix into each client's local heterogeneous model to learn the angle information of its local representations. These homogeneous representation angle square matrices are aggregated on the server to fuse the representation angle knowledge shared by clients, enhancing the generalization of local representations. (2) Since clients may have heterogeneous system resources, we propose an adaptive diagonal sparsification strategy that reduces the number of representation angle matrix parameters uploaded to the server, improving FL communication efficiency. (3) To fuse the sparsified homogeneous local representation angle matrices effectively, we design an element-wise weighted aggregation approach. Experiments on 4 benchmark datasets under 2 types of non-IID partitions against 6 state-of-the-art baselines demonstrate that FedRAL achieves the best performance, improving test accuracy, communication efficiency and computational efficiency by up to 5.03%, 12.43× and 6.49×, respectively.
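
The sketch below illustrates the three components described in the abstract in PyTorch: a homogeneous square matrix inserted between a client's heterogeneous backbone and its head, a diagonal-preserving sparsification of that matrix before upload, and an element-wise weighted aggregation on the server. It is a minimal illustration under assumptions; the exact placement of the angle matrix, the top-k sparsification rule, and the zero-aware weighting are hypothetical choices, not the paper's precise formulation.

```python
import torch
import torch.nn as nn

class RepresentationAngleClient(nn.Module):
    """Hypothetical client model: a structurally heterogeneous backbone followed by
    a homogeneous d x d square matrix that transforms the local representation."""
    def __init__(self, backbone: nn.Module, rep_dim: int, num_classes: int):
        super().__init__()
        self.backbone = backbone                                 # differs across clients
        self.angle = nn.Linear(rep_dim, rep_dim, bias=False)     # shared-shape square matrix
        self.head = nn.Linear(rep_dim, num_classes)

    def forward(self, x):
        h = self.backbone(x)      # local representation
        h = self.angle(h)         # learns angle (directional) information of h
        return self.head(h)

def sparsify_diagonal(angle_weight: torch.Tensor, keep_ratio: float) -> torch.Tensor:
    """Assumed sparsification: always keep the diagonal, plus the largest-magnitude
    off-diagonal entries up to keep_ratio, zeroing the rest before upload."""
    d = angle_weight.shape[0]
    mask = torch.eye(d, dtype=torch.bool)
    off = angle_weight.masked_fill(mask, 0.0).abs().flatten()
    k = int(keep_ratio * off.numel())
    if k > 0:
        thresh = off.topk(k).values.min()
        mask |= angle_weight.abs() >= thresh
    return angle_weight * mask

def elementwise_weighted_aggregate(matrices, weights):
    """Assumed server-side fusion: average each entry only over the clients whose
    sparsified upload actually carries that entry, weighted e.g. by data size."""
    stacked = torch.stack(matrices)                              # (K, d, d)
    w = torch.tensor(weights).view(-1, 1, 1) * (stacked != 0)    # zero entries get no vote
    denom = w.sum(0).clamp_min(1e-12)
    return (w * stacked).sum(0) / denom
```

In this reading, only the small d x d angle matrix (after sparsification) travels between clients and server, while the heterogeneous backbones and heads stay local, which is consistent with the communication-efficiency claim in the abstract.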
