Poster
Contrastive Flow Matching
George Stoica · Vivek Ramanujan · Xiang Fan · Ali Farhadi · Ranjay Krishna · Judy Hoffman
Unconditional flow matching trains diffusion models to efficiently transport samples from a source distribution to samples of a target distribution by enforcing that the flows between sample pairs from the two distributions are unique. However, in conditional settings (e.g., class-conditioned models), this uniqueness is no longer guaranteed: flows from different conditions may overlap, leading to more ambiguous generations. We introduce Contrastive Flow Matching (CFM), an extension of the flow-matching objective that explicitly enforces uniqueness across all conditional flows, enhancing condition separation. Our approach adds a contrastive objective that maximizes the dissimilarity between predicted flows from arbitrary sample pairs. We validate Contrastive Flow Matching with extensive experiments across varying SiT model sizes on the popular ImageNet-1K (256x256) and (512x512) benchmarks. Notably, we find that training models with CFM (1) improves training speed by a factor of up to 2x, (2) requires up to 5x fewer denoising steps, and (3) lowers FID by up to 8.9 compared to training the same models with flow matching. We commit to releasing our code upon publication.
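As a rough illustration of the idea described above, the sketch below combines a standard flow-matching regression term with a contrastive term that pushes the predicted flow away from the target flows of mismatched (shuffled) sample pairs in the batch. The function name, the weight `lam`, and the use of an in-batch shuffle for negatives are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

def contrastive_flow_matching_loss(v_pred, x0, x1, lam=0.05, rng=None):
    """Hypothetical sketch of a CFM-style loss (names and weighting assumed).

    v_pred : (B, D) predicted flows for each sample pair
    x0, x1 : (B, D) source / target samples; the flow-matching target is x1 - x0
    lam    : weight of the contrastive (repulsion) term
    """
    rng = np.random.default_rng(rng)

    # Standard flow-matching term: regress onto the matched target flow.
    target = x1 - x0
    fm = np.mean(np.sum((v_pred - target) ** 2, axis=1))

    # Contrastive term: target flows of *other* pairs in the batch,
    # obtained here by a random permutation of the batch indices.
    perm = rng.permutation(len(x0))
    neg_target = x1[perm] - x0[perm]
    contrast = np.mean(np.sum((v_pred - neg_target) ** 2, axis=1))

    # Pull toward the matched flow, push away from mismatched flows.
    return fm - lam * contrast
```

With `lam=0` this reduces to the plain flow-matching objective; increasing `lam` trades regression accuracy against separation between flows of different sample pairs (and hence different conditions).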