Poster
PEFTDiff: Diffusion-Guided Transferability Estimation for Parameter-Efficient Fine-Tuning
Prafful Khoba · Zijian Wang · Chetan Arora · Mahsa Baktashmotlagh
Abstract:
Selecting an optimal Parameter-Efficient Fine-Tuning (PEFT) technique for a downstream task is a fundamental challenge in transfer learning. Unlike full fine-tuning, where all model parameters are updated, PEFT techniques modify only a small subset of parameters while keeping the backbone frozen, making them computationally efficient. However, this efficiency introduces a distinct problem: choosing the PEFT method that will be most effective for a given dataset. Existing transferability estimation (TE) metrics primarily focus on ranking distinct architectures and struggle to detect the subtle embedding differences introduced by different PEFT methods sharing the same backbone. To address this limitation, we propose a novel diffusion-based metric explicitly designed for PEFT selection. Unlike conventional metrics, our approach models the fine-grained geometric relationships of the embedding space through a diffusion process, effectively quantifying intra-class compactness and inter-class separability. Extensive evaluations on the VTAB-1k benchmark validate our method’s effectiveness, demonstrating a substantial 68.95% improvement over LogME, 1297.29% over $\mathcal{N}$LEEP, 149.75% over NCTI, and 140.46% over SFDA, four widely used TE methods designed for ranking pre-trained models.
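The abstract does not spell out the scoring mechanics, but a minimal sketch of the general idea (build a diffusion operator over a graph of probe embeddings, then measure how much diffused probability mass stays within each sample's class) might look as follows. All names and parameters here (diffusion_transferability_score, sigma, t) are illustrative assumptions, not the paper's actual formulation.

import numpy as np

def diffusion_transferability_score(embeddings: np.ndarray, labels: np.ndarray,
                                    sigma: float = 1.0, t: int = 3) -> float:
    """Score embeddings by how much diffusion mass stays within each class.

    Hypothetical sketch of a diffusion-based TE score, not the paper's method.
    """
    # Pairwise squared Euclidean distances between probe embeddings.
    sq_norms = (embeddings ** 2).sum(axis=1)
    d2 = sq_norms[:, None] + sq_norms[None, :] - 2.0 * embeddings @ embeddings.T
    np.clip(d2, 0.0, None, out=d2)  # guard against tiny negative values

    # Gaussian affinity kernel, row-normalized into a Markov transition matrix.
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    P = W / W.sum(axis=1, keepdims=True)

    # Run t diffusion steps: P_t[i, j] is the mass flowing from sample i to j.
    P_t = np.linalg.matrix_power(P, t)

    # Intra-class compactness: fraction of each sample's diffused mass that
    # lands on samples of its own class. The remainder leaks across class
    # boundaries, so a high value also reflects inter-class separability.
    same_class = labels[:, None] == labels[None, :]
    intra_mass = (P_t * same_class).sum(axis=1)
    return float(intra_mass.mean())

In a PEFT-selection loop, one would extract embeddings of the same probe set from each candidate PEFT-adapted model and rank candidates by this score, with higher intra-class diffusion mass suggesting more compact, better-separated classes after fine-tuning.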