Research Objective
To address the main artifacts of depth-image-based rendering (DIBR) for high-quality 2D-to-3D conversion: misaligned depth edges, disocclusions, and cracks introduced during resampling.
Research Findings
The proposed scheme effectively handles depth edge misalignment, disocclusions, and resampling cracks in DIBR through adaptive domain transform filtering, achieving high efficiency and good virtual-view quality, which makes it suitable for 2D-to-3D conversion applications. Future work will extend the scheme to 3D video synthesis.
Research Limitations
The scheme is limited to generating a stereoscopic view from a single texture image and its depth map; it does not exploit temporal correlation for video applications, and depth maps must be pre-generated from depth cues.
1:Experimental Design and Method Selection:
The methodology involves a domain transform based filtering framework with two adaptive filters applied sequentially before and after depth map warping. It uses backward texture warping for virtual view synthesis.
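The edge-aware smoothing at the heart of this framework can be sketched as a one-dimensional recursive pass in the transformed domain, in the style of Gastal and Oliveira's domain transform. This is a minimal illustration only; the parameter names and values are assumptions, not the paper's settings.

```python
import numpy as np

def domain_transform_filter_1d(signal, guide, sigma_s=40.0, sigma_r=0.5):
    """Edge-aware 1D recursive filter in the transformed domain.

    Sketch of a domain-transform-style filter: distances in the
    transformed domain grow across guide-image edges, so smoothing
    is suppressed there while flat regions are smoothed strongly.
    sigma_s/sigma_r are illustrative spatial/range parameters.
    """
    # Derivative of the domain transform: ct'(x) = 1 + (sigma_s/sigma_r)|I'(x)|
    dIdx = np.abs(np.diff(guide, prepend=guide[0]))
    dt = 1.0 + (sigma_s / sigma_r) * dIdx

    a = np.exp(-np.sqrt(2.0) / sigma_s)  # feedback coefficient
    out = signal.astype(np.float64).copy()
    # Left-to-right recursive pass: weight a**dt shrinks at edges
    for i in range(1, len(out)):
        w = a ** dt[i]
        out[i] = (1.0 - w) * out[i] + w * out[i - 1]
    # Right-to-left pass for symmetry
    for i in range(len(out) - 2, -1, -1):
        w = a ** dt[i + 1]
        out[i] = (1.0 - w) * out[i] + w * out[i + 1]
    return out
```

Applied with the texture image as the guide, such a filter snaps smoothed depth values to texture boundaries, which is how misaligned depth edges are corrected without blurring across them.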
2:Sample Selection and Data Sources:
Five test sets (Flower, Castle, Orbi, Desktop, Sculpture) with different spatial resolutions and depth cues (e.g., depth from motion, structure from motion) are used for evaluation.
3:List of Experimental Equipment and Materials:
A commodity PC with an Intel Core2 Quad Q9400 CPU (2.66 GHz); the scheme was implemented in Microsoft Visual Studio C++ 2008, with the domain transform module in MATLAB.
4:Experimental Procedures and Operational Workflow:
First, apply a domain transform filter to the original depth map for boundary alignment and disocclusion reduction. Then warp the depth map to the virtual view and apply a second domain transform filter with scene-gradient constraints to suppress cracks and noise. Finally, synthesize the virtual texture view by backward warping.
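The warping steps of this workflow can be sketched numerically on 1D image rows. The disparity model `d = focal * baseline / depth` and all parameter values below are assumptions for illustration; the two domain transform filtering steps are elided.

```python
import numpy as np

def warp_depth_forward(depth, baseline=0.05, focal=500.0):
    """Forward-warp a depth row to the virtual view.

    Collisions keep the nearest surface (z-buffer); positions left
    unfilled (np.inf) are the disocclusions the scheme must reduce.
    """
    w = len(depth)
    warped = np.full(w, np.inf)
    for x in range(w):
        d = int(round(baseline * focal / depth[x]))  # assumed disparity model
        xv = x - d
        if 0 <= xv < w:
            warped[xv] = min(warped[xv], depth[x])  # keep closest surface
    return warped

def backward_warp_texture(texture, virtual_depth, baseline=0.05, focal=500.0):
    """Synthesize the virtual texture row by backward warping.

    Each virtual-view pixel fetches the reference-view pixel indicated
    by the (ideally filtered) virtual-view depth, avoiding the cracks
    that forward texture warping would produce.
    """
    w = len(texture)
    out = np.zeros_like(texture)
    for xv in range(w):
        if np.isfinite(virtual_depth[xv]):
            xr = xv + int(round(baseline * focal / virtual_depth[xv]))
            if 0 <= xr < w:
                out[xv] = texture[xr]
    return out
```

In the full scheme, the first domain transform filter runs on the depth map before `warp_depth_forward`, and the second runs on the warped depth before `backward_warp_texture`, so cracks and noise in the virtual-view depth are removed before texture lookup.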
5:Data Analysis Methods:
Subjective quality evaluation by 15 observers, who scored image quality and stereoscopic effect, together with computation-time comparisons.
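Subjective scores of this kind are typically aggregated into a mean opinion score. The sketch below shows one standard way to do so, with a 95% confidence interval; the paper's exact rating scale and statistical treatment are not specified here, so this is an assumption.

```python
import numpy as np

def mean_opinion_score(scores):
    """Aggregate per-observer subjective ratings into a mean opinion
    score (MOS) with a normal-approximation 95% confidence interval.

    A generic sketch; the rating scale (e.g., 1-5) and any screening
    of observers are assumptions, not taken from the paper.
    """
    s = np.asarray(scores, dtype=float)
    mos = s.mean()
    ci95 = 1.96 * s.std(ddof=1) / np.sqrt(len(s))
    return mos, ci95
```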