Research Objective
To propose a robust HDR video synthesis algorithm for LDR videos captured with alternating exposures, using superpixel-based illumination-invariant motion estimation.
Research Findings
The proposed algorithm effectively synthesizes high-quality HDR videos by accurately estimating correspondences using a superpixel-based illumination-invariant motion estimation technique, outperforming existing methods in reducing artifacts and preserving object details.
Research Limitations
The method assumes that regions which are poorly exposed in the reference frame are well exposed in adjacent frames, which may not hold in all scenarios. The evaluation relies on simulated data, and real-world applicability may be constrained by computational complexity or unmodeled variations.
1: Experimental Design and Method Selection:
The algorithm involves over-segmenting input frames using superpixel segmentation, employing the DASC feature descriptor for illumination-invariant motion estimation, and using SIFT flow for correspondence estimation in well-exposed regions. Bidirectional motion estimation is used for robustness, and a cost minimization approach is applied in poorly-exposed regions. HDR frames are synthesized by weighted averaging of irradiance maps.
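The final synthesis step, weighted averaging of irradiance maps, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a simple gamma camera response function and a triangle weighting that favors well-exposed pixels, and it assumes the frames have already been motion-aligned.

```python
import numpy as np

def inverse_crf(ldr, gamma=2.2):
    """Map an 8-bit LDR frame back to relative irradiance.
    The gamma CRF here is an assumption; in practice the camera
    response is recovered from the input sequence."""
    return (ldr.astype(np.float64) / 255.0) ** gamma

def triangle_weight(ldr):
    """Weight pixels by how well exposed they are: mid-range values
    contribute most, clipped (dark/saturated) pixels least."""
    z = ldr.astype(np.float64) / 255.0
    return 1.0 - np.abs(2.0 * z - 1.0)

def merge_irradiance(ldr_frames, exposures, gamma=2.2, eps=1e-8):
    """Synthesize an HDR frame by weighted averaging of per-frame
    irradiance maps (frames assumed already motion-aligned)."""
    num = np.zeros_like(ldr_frames[0], dtype=np.float64)
    den = np.zeros_like(num)
    for ldr, t in zip(ldr_frames, exposures):
        w = triangle_weight(ldr)
        num += w * inverse_crf(ldr, gamma) / t  # divide out exposure time
        den += w
    return num / (den + eps)
```

The division by exposure time normalizes each frame into a common irradiance domain before blending, so short and long exposures contribute comparable values.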
2: Sample Selection and Data Sources:
The evaluation uses two datasets: Hallway2 and Student sequences from the LiU HDRv Repository, with alternating frames generated by simulating image acquisition using a camera response function.
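The simulation of alternating-exposure acquisition from ground-truth HDR frames can be sketched as below. The gamma response and the short/long exposure values are stand-in assumptions for the camera response function used in the evaluation.

```python
import numpy as np

def apply_crf(irradiance, exposure, gamma=2.2):
    """Simulate LDR acquisition from ground-truth HDR irradiance:
    scale by exposure, clip to the sensor range, apply a gamma
    camera response function (an assumed CRF), quantize to 8 bits."""
    e = np.clip(irradiance * exposure, 0.0, 1.0)
    return np.round(255.0 * e ** (1.0 / gamma)).astype(np.uint8)

def alternate_exposures(hdr_frames, low=0.25, high=4.0):
    """Generate an alternating short/long exposure LDR sequence
    from an HDR video, as in the simulated evaluation setup."""
    return [apply_crf(f, low if i % 2 == 0 else high)
            for i, f in enumerate(hdr_frames)]
```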
3: List of Experimental Equipment and Materials:
No specific equipment or materials are listed; the method is computational and relies on software implementations.
4: Experimental Procedures and Operational Workflow:
Steps include frame segmentation, motion estimation in well- and poorly-exposed regions, correspondence validation, and HDR frame synthesis via weighted averaging.
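The correspondence-validation step with bidirectional motion can be illustrated with a forward-backward consistency check: a forward motion vector followed by the backward vector at its target should return near the starting pixel. This is a generic numpy sketch with nearest-neighbor lookup and an assumed pixel tolerance `tau`, not the paper's exact validation rule.

```python
import numpy as np

def warp_flow(flow_bwd, flow_fwd):
    """Look up the backward flow at the positions reached by the
    forward flow (nearest-neighbor sampling, clipped at borders)."""
    h, w = flow_fwd.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    xt = np.clip(np.round(xs + flow_fwd[..., 0]).astype(int), 0, w - 1)
    yt = np.clip(np.round(ys + flow_fwd[..., 1]).astype(int), 0, h - 1)
    return flow_bwd[yt, xt]

def consistency_mask(flow_fwd, flow_bwd, tau=1.0):
    """Mark a correspondence valid when forward + backward motion
    nearly cancels; tau is an assumed tolerance in pixels."""
    back = warp_flow(flow_bwd, flow_fwd)
    err = np.linalg.norm(flow_fwd + back, axis=-1)
    return err < tau
```

Pixels failing the check would then fall back to the separate handling used for poorly-exposed regions.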
5: Data Analysis Methods:
Performance is compared with state-of-the-art algorithms (Kalantari et al. and Li et al.) through visual inspection of synthesized frames for artifacts and detail preservation.