Poster
Estimating 2D Camera Motion with Hybrid Motion Basis
Haipeng Li · Tianhao Zhou · Zhanglei Yang · WuYi WuYi · Chen Yan · Zijing Mao · Shen Cheng · Bing Zeng · Shuaicheng Liu
Estimating 2D camera motion is a fundamental task in computer vision, representing the non-linear projection of 3D rotation and translation onto a 2D plane. Current methods primarily rely on homography-based approaches, which model perspective transformations for planar scenes, or meshflow-based techniques, which use grid-based local homographies to accommodate non-linear motion. However, homography is restricted to dominant planes, and meshflow's non-linear capacity remains limited. To address these challenges, we introduce CamFlow, a novel representation that captures non-linear 2D camera motion through hybrid motion bases: 1) physical bases to model essential motion patterns and 2) noisy motion bases to enhance flexibility. In addition, we propose a hybrid probabilistic loss function based on the Laplace distribution, which improves robustness and enables efficient training. We also design a test-time adaptation strategy to refine motion estimates for video stabilization in unseen video contexts. To evaluate camera motion estimation, we propose a new benchmark that masks dynamic objects in existing optical flow datasets. Extensive experiments, including zero-shot evaluations across diverse conditions, demonstrate that CamFlow outperforms state-of-the-art homography and meshflow methods in robustness and generalization. Code and dataset will be released upon publication.
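The abstract does not specify the exact basis set or loss formulation, so the following is a minimal illustrative sketch, not the authors' method: it shows one way a dense 2D motion field could be composed as a weighted sum of fixed basis flows (a few hand-crafted "physical" bases plus random smooth "noisy" bases), and one plausible Laplace negative log-likelihood loss. All function names, the particular basis definitions, and the loss form are assumptions for illustration only.

```python
# Hypothetical sketch (not the released CamFlow code): a motion field as a
# weighted combination of basis flows, plus a Laplace NLL as one possible
# instance of a probabilistic loss built on the Laplace distribution.
import numpy as np


def physical_bases(h, w):
    """Simple physical motion bases on an h x w grid: translations, rotation, zoom."""
    ys, xs = np.meshgrid(np.linspace(-1, 1, h), np.linspace(-1, 1, w), indexing="ij")
    tx = np.stack([np.ones_like(xs), np.zeros_like(xs)], axis=-1)   # translate along x
    ty = np.stack([np.zeros_like(xs), np.ones_like(xs)], axis=-1)   # translate along y
    rot = np.stack([-ys, xs], axis=-1)                              # in-plane rotation
    zoom = np.stack([xs, ys], axis=-1)                              # scaling about center
    return np.stack([tx, ty, rot, zoom])                            # shape (4, h, w, 2)


def noisy_bases(h, w, k, seed=0):
    """Random low-frequency bases (assumption: smooth noise adds flexibility)."""
    rng = np.random.default_rng(seed)
    coarse = rng.standard_normal((k, 8, 8, 2))
    reps = (1, int(np.ceil(h / 8)), int(np.ceil(w / 8)), 1)
    return np.tile(coarse, reps)[:, :h, :w, :]                      # shape (k, h, w, 2)


def compose_flow(weights, bases):
    """Flow = sum_k w_k * B_k, giving a dense field of shape (h, w, 2)."""
    return np.tensordot(weights, bases, axes=(0, 0))


def laplace_nll(pred_flow, target_flow, log_scale):
    """Per-pixel Laplace negative log-likelihood: |error| / b + log(2b)."""
    b = np.exp(log_scale)
    err = np.abs(pred_flow - target_flow)
    return np.mean(err / b + np.log(2.0 * b))


if __name__ == "__main__":
    h, w = 64, 64
    bases = np.concatenate([physical_bases(h, w), noisy_bases(h, w, k=4)])
    weights = np.array([0.5, -0.2, 0.1, 0.05, 0.0, 0.0, 0.0, 0.0])
    flow = compose_flow(weights, bases)
    print("flow field:", flow.shape,
          "loss vs. zero flow:", laplace_nll(flow, np.zeros_like(flow), np.log(0.1)))
```

In a learned setting, a network would predict the basis weights (and possibly the Laplace scale) per frame pair; the sketch above only fixes them by hand to show how the pieces fit together.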