

Poster

Diffusion-based 3D Hand Motion Recovery with Intuitive Physics

Yufei Zhang · Zijun Cui · Jeffrey Kephart · Qiang Ji


Abstract:

While 3D hand reconstruction from monocular images has made significant progress, generating accurate and temporally coherent motion estimates from video sequences remains challenging, particularly during complex hand-object interactions. In this paper, we present a novel 3D hand motion recovery framework that enhances image-based reconstructions through a diffusion-based, physics-augmented motion refinement model. Our model captures the distribution of refined motion estimates conditioned on initial ones, generating improved sequences through an iterative denoising process. Instead of relying on scarce annotated video data, we train our model using only existing motion capture data, without images. Moreover, we identify valuable intuitive physics knowledge during hand-object interactions, including key motion states and their associated motion constraints. We effectively integrate these physical insights into our diffusion model to improve its performance. Extensive experiments demonstrate that our approach significantly improves various frame-wise reconstruction methods, achieving state-of-the-art (SOTA) performance on existing benchmarks.
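The iterative denoising described in the abstract follows the standard conditional diffusion (DDPM-style) sampling pattern: starting from noise, a learned denoiser repeatedly predicts and removes noise, conditioned on the initial image-based motion estimate. The sketch below illustrates only this generic sampling loop, not the authors' actual architecture or physics constraints; `model`, the schedule length `T`, and the linear noise schedule are all illustrative assumptions.

```python
import numpy as np

def refine_motion(initial_motion, model, T=50, rng=None):
    """DDPM-style conditional sampling sketch (assumed, not the paper's exact model).

    initial_motion: array of shape (frames, dims), the image-based estimate
                    used as conditioning.
    model:          hypothetical learned denoiser; callable (x, cond, t) -> noise
                    prediction with the same shape as x.
    """
    rng = np.random.default_rng(rng)
    x = rng.standard_normal(initial_motion.shape)  # start from pure Gaussian noise
    betas = np.linspace(1e-4, 0.02, T)             # assumed linear noise schedule
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)
    for t in reversed(range(T)):
        # Predict the noise component, conditioned on the initial estimate.
        eps_hat = model(x, initial_motion, t)
        # Posterior mean of x_{t-1} given x_t (standard DDPM update).
        coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
        mean = (x - coef * eps_hat) / np.sqrt(alphas[t])
        # Add noise at every step except the last.
        noise = rng.standard_normal(x.shape) if t > 0 else 0.0
        x = mean + np.sqrt(betas[t]) * noise
    return x
```

With a trained denoiser, repeated application of this loop maps an initial, possibly jittery per-frame reconstruction to a sample from the learned distribution of refined motions; the conditioning argument is what lets the output stay faithful to the input sequence rather than being an unconstrained sample.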
