

Poster

Cooperative Pseudo Labeling for Unsupervised Federated Classification

Kuangpu Guo · Lijun Sheng · Yongcan Yu · Jian Liang · Zilei Wang · Ran He


Abstract:

Unsupervised federated learning (UFL) aims to collaboratively train a global model across distributed clients without sharing data or label information. Previous UFL works have predominantly focused on representation learning and clustering tasks. Recently, vision-language models (e.g., CLIP) have gained significant attention for their attractive zero-shot prediction capabilities. Leveraging this advancement, classification problems that were previously infeasible under the UFL paradigm now present new opportunities but remain largely unexplored. In this paper, we extend UFL to the classification problem with CLIP for the first time and propose a novel method, Federated Cooperative Pseudo Labeling (FedCoPL). Specifically, clients estimate and upload their pseudo-label distributions, and the server adjusts and redistributes them to avoid global imbalance among categories. Moreover, we introduce a partial prompt aggregation protocol for effective collaboration and personalization. In particular, visual prompts containing general image features are aggregated at the server, while text prompts encoding personalized knowledge are retained locally. Extensive experiments on six datasets demonstrate the superior performance of our FedCoPL compared to baseline methods. Our code is available in the supplementary materials.
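To make the two mechanisms in the abstract concrete, here is a minimal sketch of (1) a server-side rebalancing step that reweights clients' pseudo-label distributions toward a uniform global marginal, and (2) partial prompt aggregation that averages only visual prompts while text prompts stay local. This is an illustration based solely on the abstract, not the paper's actual algorithm; all function names and the specific reweighting rule are assumptions.

```python
import numpy as np

def rebalance_pseudo_label_distributions(client_dists):
    """Reweight each client's class distribution to push the global
    pseudo-label marginal closer to uniform (illustrative rule only)."""
    dists = np.asarray(client_dists, dtype=float)   # (num_clients, num_classes)
    global_dist = dists.mean(axis=0)                # current global marginal
    num_classes = dists.shape[1]
    target = np.full(num_classes, 1.0 / num_classes)
    weights = target / np.maximum(global_dist, 1e-8)  # per-class correction
    adjusted = dists * weights                        # reweight every client
    return adjusted / adjusted.sum(axis=1, keepdims=True)  # rows sum to 1

def partial_prompt_aggregation(client_prompts):
    """Average visual prompts across clients (FedAvg-style); keep each
    client's text prompt personalized and untouched."""
    shared_visual = np.mean([p["visual"] for p in client_prompts], axis=0)
    return [{"visual": shared_visual.copy(), "text": p["text"]}
            for p in client_prompts]
```

Under this sketch, a class that is over-represented in the clients' pseudo labels is down-weighted before redistribution, and after aggregation every client holds the same visual prompt but its own text prompt.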
