Event-guided Unified Framework for Low-light Video Enhancement, Frame Interpolation, and Deblurring
Abstract
In low-light environments, a longer exposure time is generally required to improve image visibility; however, long exposures inevitably introduce motion blur. Moreover, even with an extended exposure time, videos captured in low light still suffer from low visibility, low contrast, and color distortion, and the long exposure also lowers the frame rate. Consequently, low-light videos exhibit motion blur and poor visibility as well as low frame rates. To overcome these limitations, we propose a novel problem: transforming blurry, low-frame-rate videos captured in low-light environments into sharp, high-frame-rate videos with enhanced visibility. To tackle this challenge, we leverage the unique advantages of event cameras, which asynchronously capture per-pixel brightness changes and thus offer higher temporal resolution and a wider dynamic range than conventional frame-based cameras. These properties make event cameras particularly effective for reducing motion blur, compensating for low frame rates, and enhancing visibility in low-light conditions. To this end, we develop a hybrid camera system that integrates two RGB cameras and an event camera, capture a dedicated dataset for this task, and propose novel network architectures that effectively address this unified problem. The code and dataset will be released upon acceptance.