

Poster

Federated domain generalization with domain-specific soft prompts generation

Jianhan Wu · Xiaoyang Qu · Zhangcheng Huang · Jianzong Wang


Abstract:

Prompt learning has become an efficient paradigm for adapting CLIP to downstream tasks. Compared with traditional fine-tuning, prompt learning optimizes only a few parameters yet yields highly competitive results, which is especially appealing in federated learning for its computational efficiency. In federated learning scenarios, data across different clients is often non-IID, leading to domain shift among clients, which poses a formidable challenge to adaptation on downstream tasks. Federated domain generalization (FDG) methods typically learn fixed or residual soft prompts from training samples, replacing manually designed prompts to enhance the generalization ability of federated models. However, these learned prompts lack diversity and tend to ignore information about unknown domains. We propose a novel and effective method that handles FDG tasks from a generative perspective, namely federated domain generalization with domain-specific soft prompts generation (FedDSPG). Specifically, in the training phase, we introduce domain-specific soft prompts (DSPs) for each domain and integrate domain and content knowledge into the generative model shared among clients. In the inference phase, the generator is used to obtain DSPs for unseen target domains, thus guiding downstream tasks in unknown domains. Extensive experiments on several public datasets show that our method achieves state-of-the-art performance compared with strong baselines in FDG.
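The abstract's two-phase pipeline — train a generator on per-domain soft prompts, then generate a DSP for an unseen target domain at inference — can be sketched roughly as below. This is a toy illustration only: the names (`SoftPromptGenerator`, `generate_dsp`, `dsp_for_unseen`), the tiny dimensions, and the mixing-by-averaging step are all assumptions; the paper's actual components (CLIP encoders, the generative model's architecture, and federated aggregation across clients) are omitted.

```python
# Toy sketch of the DSP-generation idea described in the abstract.
# All class/function names here are hypothetical, not the paper's API.
import random

PROMPT_LEN, DIM = 4, 8  # tiny sizes chosen for illustration


def random_vec(dim, rng):
    """A stand-in for a learned embedding vector."""
    return [rng.uniform(-1.0, 1.0) for _ in range(dim)]


class SoftPromptGenerator:
    """Toy generator mapping a domain embedding to a soft prompt
    (a sequence of PROMPT_LEN vectors of dimension DIM)."""

    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        # One (pretend-learned) embedding per seen training domain.
        self.domain_embeddings = {}

    def register_domain(self, name):
        self.domain_embeddings[name] = random_vec(DIM, self.rng)

    def generate_dsp(self, domain_vec):
        # Stand-in for the generative model: each prompt token is a
        # scaled copy of the conditioning domain vector.
        return [[x * (i + 1) / PROMPT_LEN for x in domain_vec]
                for i in range(PROMPT_LEN)]

    def dsp_for_unseen(self, seen_names):
        # For an unseen target domain, condition the generator on a mix
        # (here: a simple average) of the seen-domain embeddings.
        embs = [self.domain_embeddings[n] for n in seen_names]
        mixed = [sum(vals) / len(vals) for vals in zip(*embs)]
        return self.generate_dsp(mixed)


# Clients hold data from different seen domains (e.g. PACS-style splits).
gen = SoftPromptGenerator()
for d in ["photo", "sketch", "cartoon"]:
    gen.register_domain(d)

# Inference: produce a DSP for an unseen target domain.
dsp = gen.dsp_for_unseen(["photo", "sketch", "cartoon"])
print(len(dsp), len(dsp[0]))  # PROMPT_LEN tokens, each of dimension DIM
```

In the real method the generated DSP would be prepended to CLIP's text input in place of a hand-crafted prompt; here the point is only the interface: seen-domain knowledge goes into the generator during training, and only a domain descriptor is needed to synthesize a prompt for an unseen domain.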
