

Poster

Forgetting Through Transforming: Enabling Federated Unlearning via Class-Aware Representation Transformation

Qi Guo · Zhen Tian · Minghao Yao · Saiyu Qi · Yong Qi · Bingyi Liu


Abstract: Federated Unlearning (FU) should satisfy three key requirements: a guarantee of data erasure, preservation of model utility, and reduction of unlearning time. Recent studies focus on identifying and modifying the original model parameters relevant to the unlearning data. While they can achieve faster unlearning, they degrade model performance on the remaining data or fail to forget the unlearning data, because the parameters specific to the unlearning data are difficult to isolate. By revisiting the representation distribution of the optimal unlearning models (i.e., the retrained models), we observe that unlearning data tends to cluster within semantically related categories of the remaining data. This inspires us to transform the distribution of the unlearning data so that it fuses with similar categories in the remaining data, enabling effective FU. Based on this insight, we propose a novel framework, named FUCRT, to achieve Federated Unlearning via Class-aware Representation Transformation. FUCRT consists of two key components: (1) a transformation class identification strategy (TCI) that leverages the original model to identify appropriate transformation classes for the unlearning data, and (2) a targeted transformation learning process (TTL) with a cross-class fusion mechanism that ensures effective and consistent transformation of the unlearning data. Extensive experiments on four datasets demonstrate that FUCRT not only achieves 100% data erasure but also outperforms state-of-the-art methods by an average of 2.96% and 3.78% in utility preservation under IID and Non-IID settings, respectively. Moreover, it reduces unlearning time by 19.13%–96.38%.
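The abstract describes TCI and TTL only at a high level, so the following is a minimal PyTorch sketch of how the two steps could fit together, under stated assumptions: identify_transformation_classes and transformation_learning are hypothetical names, and choosing transformation classes by the original model's softmax confusion and then relabeling unlearning samples toward those classes are plausible stand-ins for the paper's TCI strategy and cross-class fusion mechanism, not the authors' actual method.

```python
import torch
import torch.nn.functional as F


@torch.no_grad()
def identify_transformation_classes(model, unlearn_loader, unlearn_classes, device="cpu"):
    """TCI (sketch): for each unlearning class, pick the remaining class the
    original model most often confuses it with, i.e. the class with the highest
    mean predicted probability once the unlearning classes are masked out."""
    model.eval()
    prob_sums, counts = {}, {}
    for x, y in unlearn_loader:
        probs = F.softmax(model(x.to(device)), dim=1)
        probs[:, list(unlearn_classes)] = 0.0  # ignore the classes being forgotten
        for cls in unlearn_classes:
            mask = y == cls
            if mask.any():
                prob_sums[cls] = prob_sums.get(cls, 0) + probs[mask.to(device)].sum(0)
                counts[cls] = counts.get(cls, 0) + int(mask.sum())
    # map each unlearning class to its most similar remaining class
    return {cls: int((prob_sums[cls] / counts[cls]).argmax()) for cls in prob_sums}


def transformation_learning(model, unlearn_loader, remain_loader, mapping,
                            epochs=1, lr=1e-3, device="cpu"):
    """TTL (sketch): relabel unlearning samples with their transformation classes
    so their representations are pulled into those clusters, while a standard
    loss on the remaining data preserves utility."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for (xu, yu), (xr, yr) in zip(unlearn_loader, remain_loader):
            yu_t = torch.tensor([mapping[int(c)] for c in yu], device=device)
            loss = (F.cross_entropy(model(xu.to(device)), yu_t)
                    + F.cross_entropy(model(xr.to(device)), yr.to(device)))
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model
```

The idea mirrored here is the abstract's core insight: unlearning samples are not pushed toward arbitrary targets but toward the remaining classes their representations already resemble, which is what lets erasure coexist with utility preservation.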
