Poster

Flexi-FSCIL: Adaptive Knowledge Retention for Breaking the Stability-Plasticity Dilemma in Few-Shot Class-Incremental Learning

Wufei Xie · Yalin Wang · Chenliang Liu · Zhaohui Jiang · Xue Yang


Abstract:

Few-Shot Class-Incremental Learning (FSCIL) is challenged by limited data and an expanding class space, which lead to overfitting and catastrophic forgetting. Existing methods, which often freeze the feature extractor and rely on Nearest Class Mean classifiers, sacrifice adaptability to new feature distributions. To address these issues, we propose Flexi-FSCIL, a semi-supervised framework that balances stability and plasticity through three novel strategies: Adaptive Gated Residual Fusion (AGRF), Attention-Guided Dynamic Hybrid Distillation (ADHD), and Prototype Offset Equilibrium (POE). AGRF resolves the rigidity of frozen feature extractors by combining frozen and trainable components, enabling adaptive feature learning while retaining old-class knowledge. ADHD tackles the imbalance between old and new tasks by dynamically aligning features through cross-attention maps and direct matching, preserving old-class knowledge while facilitating new-class learning. POE counters prototype drift in the semi-supervised setting by selecting high-quality unlabeled samples, maintaining feature-space separability and preventing overfitting. Evaluated on three benchmark datasets, Flexi-FSCIL achieves state-of-the-art performance, significantly outperforming existing FSCIL methods with a performance drop of only 12.97% on CUB200.
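To make the AGRF idea concrete, below is a minimal PyTorch sketch of a gated residual fusion between a frozen feature stream (stability) and a trainable one (plasticity). The module name, gate form, and dimensions are illustrative assumptions; the abstract does not specify the authors' exact design.

```python
import torch
import torch.nn as nn

class GatedResidualFusion(nn.Module):
    """Illustrative AGRF-style fusion (assumption, not the authors' exact design):
    combine features from a frozen backbone with features from a trainable
    branch via a learned, input-conditioned gate."""

    def __init__(self, feat_dim: int):
        super().__init__()
        # Gate is conditioned on both feature streams jointly.
        self.gate = nn.Sequential(
            nn.Linear(2 * feat_dim, feat_dim),
            nn.Sigmoid(),
        )

    def forward(self, frozen_feat: torch.Tensor,
                trainable_feat: torch.Tensor) -> torch.Tensor:
        # g -> 1 preserves old-class knowledge from the frozen stream;
        # g -> 0 favors adaptation via the trainable stream.
        g = self.gate(torch.cat([frozen_feat, trainable_feat], dim=-1))
        return g * frozen_feat + (1.0 - g) * trainable_feat


# Usage: fuse per-sample features from the two branches.
frozen = torch.randn(8, 512)      # features from the frozen extractor
adapted = torch.randn(8, 512)     # features from the trainable branch
fusion = GatedResidualFusion(512)
out = fusion(frozen, adapted)     # shape (8, 512)
```

The element-wise gate lets the fusion interpolate per feature dimension rather than globally, which is one plausible way a model could retain old-class knowledge while still adapting to new distributions.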
