Research Objective
To save resources and time by proposing a method for object detection and localization directly on unrectified fisheye images for MAVs, using a single-stage neural network and data fusion with onboard MAV sensors (altitude and attitude).
Research Findings
The proposed methods for fisheye object detection and localization are effective and real-time capable, validated through experiments. The detector achieves high accuracy with optimized speed, and localization benefits from data fusion, though accuracy decreases with distance and attitude changes. Future improvements should focus on dataset expansion and edge case handling.
Limitations
The method was not tested on objects lying beyond ±60 degrees of the field of view because of the limited size of the experimental court. Future work should address building a larger dataset and updating the algorithm to handle objects near the image edge.
1. Experimental Design and Method Selection
The methodology involves designing a single-stage neural network based on RetinaNet and MobileNet for object detection on fisheye images, incorporating depthwise separable convolutions to reduce computational complexity. A camera model is used for localization, and data fusion with MAV sensors (altitude, attitude) is employed.
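The complexity reduction from depthwise separable convolutions can be made concrete by comparing parameter counts. The following is a minimal sketch with hypothetical layer shapes, not the paper's actual layer configuration:

```python
# Parameter counts for a standard vs. depthwise separable convolution,
# illustrating why the detector uses depthwise separable convolutions.
# The channel counts and kernel size below are illustrative assumptions.

def standard_conv_params(c_in, c_out, k):
    """Parameters of a standard k x k convolution (bias ignored)."""
    return k * k * c_in * c_out

def separable_conv_params(c_in, c_out, k):
    """Depthwise k x k conv (one filter per input channel) followed by
    a 1 x 1 pointwise conv that mixes channels."""
    depthwise = k * k * c_in
    pointwise = c_in * c_out
    return depthwise + pointwise

c_in, c_out, k = 128, 128, 3
std = standard_conv_params(c_in, c_out, k)   # 147456
sep = separable_conv_params(c_in, c_out, k)  # 17536
print(round(std / sep, 1))                   # roughly an 8.4x reduction
```

For a 3×3 kernel the reduction approaches a factor of 9 as the output channel count grows, which is the main source of the speedup on the onboard computer.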
2. Sample Selection and Data Sources
A custom dataset of 1468 fisheye images (960x960 resolution) is built, labeled with two classes (iRobot and obstacle), divided into training (1000), validation (234), and test (234) sets.
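The split described above can be sketched as follows; the file-name pattern and random seed are invented for illustration, only the 1000/234/234 proportions come from the source:

```python
import random

# Hypothetical sketch of the dataset split: 1468 fisheye images divided
# into 1000 train / 234 validation / 234 test. Image IDs are invented.
image_ids = [f"fisheye_{i:04d}" for i in range(1468)]
rng = random.Random(42)  # fixed seed so the split is reproducible
rng.shuffle(image_ids)

train = image_ids[:1000]
val = image_ids[1000:1234]
test = image_ids[1234:]
print(len(train), len(val), len(test))  # 1000 234 234
```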
3. List of Experimental Equipment and Materials
MAV platform (DJI M100), fisheye camera (SY091HD), onboard computer (NVIDIA TX2), ultrasonic sensor, IMU, and for training, an NVIDIA TitanX GPU.
4. Experimental Procedures and Operational Workflow
Images are processed by the detector to output bounding boxes; localization uses the fisheye model and sensor data with Gauss-Newton iteration. Training involves pre-training on Pascal VOC, fine-tuning with data augmentation (flipping, rotation), and loss minimization.
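The localization step can be sketched as a Gauss-Newton solve in pixel space. The sketch below assumes an equidistant fisheye projection (r = f·θ), a downward-facing camera, and altitude from the ultrasonic sensor; the paper's calibrated camera model and IMU attitude compensation would replace these assumptions. The focal length and numerical Jacobian are illustrative choices, not the paper's implementation.

```python
import numpy as np

F = 300.0  # hypothetical focal length in pixels

def project(p, h):
    """Project ground point p = (x, y) at camera altitude h to a pixel,
    using an equidistant fisheye model r = F * theta (an assumption)."""
    x, y = p
    rho = np.hypot(x, y)
    theta = np.arctan2(rho, h)   # angle from the optical axis
    phi = np.arctan2(y, x)       # azimuth around the axis
    r = F * theta
    return np.array([r * np.cos(phi), r * np.sin(phi)])

def localize(pixel, h, iters=25):
    """Recover the ground position from an observed pixel by
    Gauss-Newton iteration with a central-difference Jacobian."""
    p = np.array([0.1, 0.1])     # initial guess near the nadir
    eps = 1e-6
    for _ in range(iters):
        res = project(p, h) - pixel          # residual in pixel space
        J = np.empty((2, 2))
        for j in range(2):
            d = np.zeros(2); d[j] = eps
            J[:, j] = (project(p + d, h) - project(p - d, h)) / (2 * eps)
        p = p - np.linalg.solve(J.T @ J, J.T @ res)  # Gauss-Newton step
    return p

# Round-trip check: project a known ground point, then recover it.
truth = np.array([1.5, -0.8])
est = localize(project(truth, h=2.0), h=2.0)
assert np.allclose(est, truth, atol=1e-6)
```

The same structure carries over to the real system: only `project` changes when the calibrated fisheye model and the attitude rotation are substituted in.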
5. Data Analysis Methods
Accuracy is measured using mAP at 0.5 IoU, runtime is recorded, and localization error is analyzed with linear fitting and comparison against ground truth from an OptiTrack motion-capture system.
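The matching criterion behind the mAP metric is intersection-over-union: a detection counts as correct when its box overlaps a ground-truth box with IoU of at least 0.5. A minimal sketch, with boxes given as (x1, y1, x2, y2):

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

# Two 10x10 boxes offset by half their width overlap with IoU = 1/3,
# so this pair would NOT count as a match at the 0.5 threshold.
print(iou((0, 0, 10, 10), (5, 0, 15, 10)))
```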