Poster
Integrating Task-Specific and Universal Adapters for Pre-Trained Model-Based Class-Incremental Learning
Yan Wang · Da-Wei Zhou · Han-Jia Ye
Class-Incremental Learning (CIL) requires a learning system to continually learn new classes without forgetting. Although Pre-trained Models (PTMs) have shown excellent performance in CIL, catastrophic forgetting still occurs as the model learns new concepts. Existing methods often freeze the pre-trained network and adapt to incremental tasks with additional lightweight modules. At inference time, the model must accurately identify the most suitable module, and retrieving an irrelevant module can degrade performance. Moreover, the selected module concentrates solely on task-specific knowledge and neglects the general knowledge shared across tasks, so it is prone to erroneous predictions when presented with similar classes from different tasks. To address these challenges, we propose integrating Task-Specific and Universal Adapters (TUNA) in this paper. Specifically, we design an orthogonal mechanism to train task-specific adapters so that each captures the most crucial features for its task. Furthermore, we introduce an adapter fusion strategy to construct a universal adapter that encodes the general knowledge shared across tasks. During inference, we combine predictions from the task-specific adapter and the universal adapter to exploit both specialized and general knowledge. Extensive experiments on various benchmark datasets demonstrate the state-of-the-art performance of our approach.
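The sketch below illustrates, in PyTorch-style pseudocode, the three ideas the abstract describes: an orthogonality penalty when training a new task-specific adapter, a universal adapter obtained by fusing task-specific adapters, and combining both adapters' predictions at inference. It is a minimal illustration under simplifying assumptions (bottleneck adapters, fusion by parameter averaging, a fixed mixing weight `alpha`); all function and module names are hypothetical and not taken from the paper's implementation.

```python
# Minimal sketch (not the authors' code): task-specific adapters with an
# orthogonality penalty, a universal adapter fused by parameter averaging,
# and logit combination at inference. All names and shapes are illustrative.
import copy
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual."""
    def __init__(self, dim=768, bottleneck=64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)
        self.act = nn.ReLU()

    def forward(self, x):
        return x + self.up(self.act(self.down(x)))

def orthogonality_penalty(new_adapter, old_adapters):
    """Encourage the new task's adapter subspace to be orthogonal to earlier ones
    (one possible instantiation of the orthogonal training mechanism)."""
    loss = 0.0
    w_new = new_adapter.down.weight                      # (bottleneck, dim)
    for old in old_adapters:
        w_old = old.down.weight.detach()
        loss = loss + (w_new @ w_old.t()).pow(2).sum()   # penalize overlap
    return loss

def fuse_adapters(adapters):
    """Build a universal adapter by averaging task-specific adapter weights
    (a simple fusion choice; the paper's fusion strategy may differ)."""
    universal = copy.deepcopy(adapters[0])
    with torch.no_grad():
        for name, p in universal.named_parameters():
            stacked = torch.stack([dict(a.named_parameters())[name] for a in adapters])
            p.copy_(stacked.mean(dim=0))
    return universal

@torch.no_grad()
def predict(frozen_backbone, task_adapter, universal_adapter, classifier, x, alpha=0.5):
    """Combine predictions from the selected task-specific adapter and the
    universal adapter on top of a frozen pre-trained backbone."""
    feats = frozen_backbone(x)
    logits_task = classifier(task_adapter(feats))
    logits_univ = classifier(universal_adapter(feats))
    return alpha * logits_task + (1 - alpha) * logits_univ
```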