

Poster

BokehDiff: Neural Lens Blur with One-Step Diffusion

Chengxuan Zhu · Qingnan Fan · Qi Zhang · Jinwei Chen · Huaqi Zhang · Chao Xu · Boxin Shi


Abstract:

We introduce a novel lens blur rendering approach that leverages a generative diffusion prior to achieve physically accurate results. Previous lens blur methods are bounded by the accuracy of depth estimation and therefore introduce artifacts at depth discontinuities. Our method employs a physics-inspired self-attention module that aligns with the image formation process, incorporating a depth-dependent circle-of-confusion constraint and self-occlusion effects. We adapt the diffusion model to a one-step inference scheme without introducing additional noise, achieving results of high quality and fidelity. To address the lack of scalable paired training data, we propose synthesizing photorealistic foregrounds with transparency using diffusion models, balancing image authenticity and scene diversity.
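The depth-dependent circle-of-confusion constraint mentioned above builds on the classic thin-lens model. As a minimal sketch (this is the standard optics formula, not the paper's actual module; all symbol names are illustrative assumptions), the blur disk diameter for a point at depth `s` can be computed as:

```python
def coc_diameter(s: float, s_f: float, f: float, N: float) -> float:
    """Thin-lens circle-of-confusion diameter for a point at depth s.

    s   : subject depth (same length unit throughout, e.g. mm)
    s_f : focus distance
    f   : focal length
    N   : f-number (aperture = f / N)
    """
    aperture = f / N  # entrance pupil diameter
    # c = A * f * |s - s_f| / (s * (s_f - f))
    return aperture * f * abs(s - s_f) / (s * (s_f - f))

# A point on the focal plane projects to a point (zero blur);
# points farther from the focal plane receive larger blur disks.
in_focus = coc_diameter(s=2000.0, s_f=2000.0, f=50.0, N=1.8)
behind = coc_diameter(s=4000.0, s_f=2000.0, f=50.0, N=1.8)
```

A per-pixel CoC map of this kind is what makes the blur depth-dependent; the paper's contribution is enforcing such a constraint inside a self-attention module rather than applying it as a fixed scatter/gather kernel.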
