

Poster

Federated Continuous Category Discovery and Learning

Lixu Wang · Chenxi Liu · Junfeng Guo · Qingqing Ye · Heng Huang · Haibo Hu · Wei Dong


Abstract:

Federated Learning (FL) studies often assume a static data distribution, whereas real-world scenarios involve dynamic changes. To address this gap, we study Federated Continuous Category Discovery and Learning (FC^2DL), an essential yet underexplored problem that enables FL models to evolve continuously by discovering and learning novel data categories. The key challenge in FC^2DL lies in merging and aligning categories discovered and learned by different clients, all while maintaining privacy. To tackle this, we propose the Global Prototype Alignment (GPA) framework. GPA first estimates the number of categories and constructs global prototypes by locating high-density regions in the representation space through bi-level clustering. To mitigate pseudo-label noise, GPA then employs a semantic-weighted loss to capture correlations between global prototypes and the novel data. This semantic weighting strategy is also applied to the contrastive loss, facilitating unsupervised novel-category learning. Additionally, GPA incorporates a mixup-based mechanism for both data and models, effectively mitigating interference between known and novel categories while alleviating forgetting. Extensive experiments across multiple datasets demonstrate GPA's superiority over state-of-the-art baselines. Notably, GPA achieves absolute gains of 5.7% to 13.1% in novel-category accuracy while preserving known-category performance. Furthermore, GPA is highly adaptable, equipping various mainstream FL algorithms with category discovery and learning capabilities, underscoring its potential for real-world deployment.
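To make the semantic-weighting idea in the abstract more concrete, below is a minimal sketch (not the authors' code) of a semantic-weighted pseudo-label loss: each unlabeled sample is assigned to its nearest global prototype, and its loss term is weighted by the confidence of that assignment to soften pseudo-label noise. The function and argument names, the cosine-similarity formulation, and the temperature value are illustrative assumptions; the actual GPA formulation may differ.

```python
import torch
import torch.nn.functional as F

def semantic_weighted_loss(features, logits, global_prototypes, temperature=0.1):
    """features: (B, d) embeddings of novel-category data
    logits: (B, K) classifier outputs over the K discovered categories
    global_prototypes: (K, d) prototypes obtained from clustering (assumed given)"""
    feats = F.normalize(features, dim=1)
    protos = F.normalize(global_prototypes, dim=1)
    # Cosine similarity of each sample to every global prototype
    sims = feats @ protos.t() / temperature                 # (B, K)
    pseudo_labels = sims.argmax(dim=1)                      # hard pseudo-labels
    # Semantic weights: assignment confidence, used to down-weight noisy labels
    weights = sims.softmax(dim=1).max(dim=1).values         # (B,)
    per_sample_ce = F.cross_entropy(logits, pseudo_labels, reduction="none")
    return (weights * per_sample_ce).mean()
```

The same per-sample weights could, under the same assumptions, be reused to scale the positive pairs of a contrastive loss, which is the spirit of the semantic weighting strategy mentioned above.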
