Research Objective
To evaluate the capability and quality of three-dimensional reconstruction of static targets using the ZED stereoscopic camera.
Research Findings
The ZED camera can achieve satisfactory 3D reconstructions with errors of a few centimeters under favorable conditions (opaque, texture-rich surfaces; appropriate acquisition distance and speed). Future work should include georeferenced comparisons and evaluations against other technologies such as the Kinect.
Research Limitations
Reconstruction quality degrades on reflective surfaces, on homogeneous textures, and at distances beyond 5-10 meters; shape distortions and unreconstructed areas appear under certain conditions; model comparison relied on software-based alignment rather than georeferenced models.
1: Experimental Design and Method Selection:
The study captured images with a ZED stereoscopic camera under varying conditions (surface types, textures, lighting, distances, and acquisition speeds) and compared the resulting 3D reconstructions against high-precision point clouds from a Leica Viva TS15 total station. The data were processed in CloudCompare to compute displacements between the models.
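The displacement computation between the ZED reconstruction and the total-station reference can be approximated outside CloudCompare as a nearest-neighbor (cloud-to-cloud) distance. The following is a minimal sketch, assuming both models have been exported as XYZ point arrays and are already roughly aligned; the file names are hypothetical placeholders.

```python
import numpy as np
from scipy.spatial import cKDTree

def cloud_to_cloud_displacements(reconstructed_xyz, reference_xyz):
    """Nearest-neighbor distance from each reconstructed point to the
    reference (total-station) cloud, similar in spirit to CloudCompare's
    cloud-to-cloud distance tool."""
    tree = cKDTree(reference_xyz)                 # spatial index on the reference cloud
    distances, _ = tree.query(reconstructed_xyz, k=1)
    return distances

if __name__ == "__main__":
    # Hypothetical exported files (e.g., ASCII XYZ from the ZED SDK and the TS15).
    zed_cloud = np.loadtxt("zed_reconstruction.xyz")[:, :3]
    ts15_cloud = np.loadtxt("ts15_reference.xyz")[:, :3]

    d = cloud_to_cloud_displacements(zed_cloud, ts15_cloud)
    print(f"mean = {d.mean():.3f} m, std = {d.std():.3f} m, max = {d.max():.3f} m")
```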
2: Sample Selection and Data Sources:
Static targets, including individual objects (e.g., sculptures) and larger environments (e.g., parking lots, lawns), were selected to cover a range of surface, texture, and scale characteristics relevant to reconstruction quality.
3: List of Experimental Equipment and Materials:
ZED stereoscopic camera, Leica Viva TS15 total station, a computer running the ZED SDK and CloudCompare software, and calibration targets.
4: Experimental Procedures and Operational Workflow:
Calibrate the ZED camera with the ZED Calibration software. Capture images by walking around objects or through environments at the specified distances and speeds. Acquire reference point clouds with the total station on predefined grids. Process the data in CloudCompare to align the models and compute errors.
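The acquisition step can be scripted through the ZED SDK's Python wrapper (pyzed). The sketch below shows a minimal spatial-mapping capture loop that produces a mesh for later comparison in CloudCompare; exact parameter and enum names vary between SDK versions, and the frame count and output path are placeholders, not values from the study.

```python
import pyzed.sl as sl

# Open the camera (names follow the ZED SDK Python API; values are illustrative).
init = sl.InitParameters()
init.coordinate_units = sl.UNIT.METER
init.depth_mode = sl.DEPTH_MODE.ULTRA

zed = sl.Camera()
if zed.open(init) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Could not open the ZED camera")

# Positional tracking is required for spatial mapping while walking around the target.
zed.enable_positional_tracking(sl.PositionalTrackingParameters())
zed.enable_spatial_mapping(sl.SpatialMappingParameters())

runtime = sl.RuntimeParameters()
for _ in range(1500):                       # placeholder: roughly 50 s of capture at 30 fps
    if zed.grab(runtime) != sl.ERROR_CODE.SUCCESS:
        break

# Extract the fused mesh and save it for comparison in CloudCompare.
mesh = sl.Mesh()
zed.extract_whole_spatial_map(mesh)
mesh.save("zed_scene_mesh.obj")

zed.disable_spatial_mapping()
zed.disable_positional_tracking()
zed.close()
```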
5: Data Analysis Methods:
Statistical analysis of displacements (e.g., mean, standard deviation) between camera-generated meshes and total station point clouds, visualized through histograms and distance maps.
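A minimal sketch of the statistical summary and histogram of displacements, assuming the per-point distances from the comparison step are available as a NumPy array; the bin count and output file name are arbitrary choices, not taken from the study.

```python
import numpy as np
import matplotlib.pyplot as plt

def summarize_displacements(distances, bins=50, out_png="displacement_histogram.png"):
    """Report summary statistics and plot a histogram of point-wise displacements (metres)."""
    stats = {
        "mean": float(np.mean(distances)),
        "std": float(np.std(distances)),
        "median": float(np.median(distances)),
        "p95": float(np.percentile(distances, 95)),
    }

    plt.figure()
    plt.hist(distances, bins=bins)
    plt.xlabel("Displacement to reference cloud [m]")
    plt.ylabel("Point count")
    plt.title("ZED reconstruction vs. Leica Viva TS15 reference")
    plt.savefig(out_png, dpi=150)
    plt.close()
    return stats

# Example: distances produced by the cloud-to-cloud comparison sketched earlier.
# print(summarize_displacements(d))
```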