

Poster

E-NeMF: Event-based Neural Motion Field for Novel Space-time View Synthesis of Dynamic Scenes

Yan Liu · Zehao Chen · Haojie Yan · De Ma · Huajin Tang · Qian Zheng · Gang Pan


Abstract:

Synthesizing novel space-time views from a monocular video is a highly ill-posed problem, and its effectiveness relies on accurately reconstructing the motion and appearance of the dynamic scene. Frame-based methods for novel space-time view synthesis in dynamic scenes rely on simplistic motion assumptions due to the absence of inter-frame cues, which makes them fail under complex motion. Event cameras capture inter-frame cues with high temporal resolution, giving them promising potential for handling high-speed and complex motion. However, exploiting events remains difficult due to their noise and sparsity. We propose E-NeMF, which alleviates the impact of event noise with a Parametric Motion Representation and mitigates event sparsity with a Flow Prediction Module. Experiments on multiple real-world datasets demonstrate our superior performance in handling high-speed and complex motion.
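The abstract does not detail the Parametric Motion Representation. A common idea behind such representations is to describe each point's trajectory with a small number of smooth basis coefficients, so that many noisy per-event cues are averaged into a stable, continuously queryable motion model. The sketch below is a minimal illustration of that idea only, assuming a low-order polynomial trajectory in time; the function names and the polynomial choice are hypothetical and not taken from the paper.

```python
import numpy as np

def fit_parametric_motion(t, x, degree=3):
    """Fit a low-order polynomial trajectory x(t) to noisy samples.

    Aggregating many noisy, event-derived positions into a handful of
    polynomial coefficients smooths out per-event noise (hypothetical
    stand-in for the paper's Parametric Motion Representation).
    t, x: 1-D arrays of timestamps and 1-D positions of equal length.
    Returns coefficients, highest degree first (np.polyfit convention).
    """
    return np.polyfit(t, x, deg=degree)

def query_motion(coeffs, t_query):
    """Evaluate the fitted trajectory at arbitrary continuous times,
    enabling space-time queries between the original video frames."""
    return np.polyval(coeffs, t_query)

# Toy usage: noisy samples of a smooth 1-D motion.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
x_true = 0.5 * t**2 + 0.1 * t                     # ground-truth trajectory
x_noisy = x_true + rng.normal(0.0, 0.05, t.shape)  # event-like noise

coeffs = fit_parametric_motion(t, x_noisy)
x_mid = query_motion(coeffs, 0.37)  # position at a time between frames
```

Because the model has far fewer parameters than samples, individual noisy events have limited influence on the recovered trajectory, which is the intuition for why a parametric representation can be robust to event noise.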
