PixelStitch: Structure-Preserving Pixel-Wise Bidirectional Warps for Unsupervised Image Stitching
Hengzhe Jin · Lang Nie · Chunyu Lin · Xiaomei Feng · Yao Zhao
Abstract:
We propose $\textit{PixelStitch}$, a pixel-wise bidirectional warp that learns to stitch images while preserving structure in an unsupervised paradigm. To produce natural stitched images, we first determine a middle plane through homography decomposition and globally project the original images toward this plane. Compared with a unidirectional homography transformation, this evenly spreads projective distortion across the two views and decreases the proportion of invalid pixels. Then, bidirectional optical flow fields are established to carry out residual pixel-wise deformation with projection-weighted natural coefficients, encouraging pixel motions to be as unnoticeable as possible in non-overlapping regions while transitioning smoothly into overlapping areas. Crucially, this flexible deformation enables $\textit{PixelStitch}$ to align large-parallax images and preserve the structural integrity of non-overlapping content. To obtain high-quality stitched images in the absence of labels, a comprehensive unsupervised objective function is proposed to simultaneously encourage content alignment, structure preservation, and bidirectional consistency. Finally, extensive experiments demonstrate our superiority over existing state-of-the-art (SoTA) methods in quantitative metrics, qualitative appearance, and generalization ability. The code will be made available.
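The middle-plane projection can be illustrated with a minimal sketch: assuming a homography H that maps view A onto view B, a matrix square root is one plausible way to split it into two half-warps so that both images are projected halfway toward a shared intermediate plane, spreading projective distortion across both views. The sqrtm-based splitting, the toy homography, the random images, and the canvas size below are illustrative assumptions, not the exact decomposition used by PixelStitch.

```python
import cv2
import numpy as np
from scipy.linalg import sqrtm

def half_warps(H):
    """Split H (view A -> view B) into two half-warps meeting at a middle plane.

    Warping A by H_half and B by inv(H_half) projects each view roughly halfway
    toward a shared intermediate plane instead of piling all distortion onto one view.
    """
    H = H / H[2, 2]                       # fix the projective scale
    H_half = np.real(sqrtm(H))            # matrix square root: H = H_half @ H_half
    return H_half / H_half[2, 2], np.linalg.inv(H_half)

# Toy example with a synthetic homography and random images (placeholders;
# a real pipeline would estimate H from matched features between the views).
H = np.array([[1.05, 0.02,  30.0],
              [0.01, 0.98, -12.0],
              [1e-5, 2e-5,   1.0]])
img_a = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)
img_b = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)

H_a, H_b = half_warps(H)
warp_a = cv2.warpPerspective(img_a, H_a, (800, 600))  # project A halfway forward
warp_b = cv2.warpPerspective(img_b, H_b, (800, 600))  # project B halfway backward
```

In this sketch the two half-warped images would then be aligned by the residual bidirectional flow fields and blended; the flow, natural coefficients, and unsupervised losses are not shown here.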