Research Purpose
To enable autonomous landing of a UAV on a moving UGV in GPS-denied environments by developing a system that combines a hybrid camera array for accurate detection, tracking, and state estimation with a nonlinear controller for precise landing, thereby addressing the challenges of inaccurate target localization, imprecise relative motion estimation, delayed control response, slow processing, and poor stability.
Research Results
The proposed hybrid-camera-array system enables autonomous landing of a UAV on a moving UGV in GPS-denied environments by fusing wide-FOV and depth information for accurate target localization and state estimation, combined with a robust nonlinear controller. Simulations and real-world experiments demonstrated high precision, robustness, and suitability for practical applications, though feature extraction and processing speed remain to be improved in future work.
Research Limitations
1. The motion compensation relies on SIFT feature extraction, which can fail against low-texture or white backgrounds, degrading the accuracy of the estimated velocity and potentially causing the landing to fail.
2. Owing to onboard processing limits, the system updates the UAV velocity command only every 100 ms, and velocity changes take time to execute; if the UGV changes direction rapidly while the UAV is at low altitude, the UGV may leave the UAV's field of view, causing tracking loss.
1:Experimental Design and Method Selection:
The study designed a UAV autonomous landing system built around a hybrid camera array (a fisheye-lens camera and a stereo camera) providing wide-FOV and depth imaging. It employed a state estimation algorithm with motion compensation and a nonlinear PID controller for UAV control. Methods included YOLOv3 for target detection, tracking-by-detection, SIFT-GPU for feature matching, homography-based motion compensation, and stereo depth estimation with Kalman filtering.
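As a rough illustration of the motion-compensation step, the sketch below estimates the inter-frame homography induced by UAV ego-motion, using CPU SIFT from OpenCV in place of the paper's SIFT-GPU; the function name, ratio-test threshold, and RANSAC tolerance are illustrative assumptions, not the authors' implementation.

```python
import cv2
import numpy as np

def estimate_ego_motion(prev_gray: np.ndarray, curr_gray: np.ndarray):
    """Estimate the inter-frame homography induced by UAV ego-motion."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(prev_gray, None)
    kp2, des2 = sift.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return None  # low-texture scene: the failure mode noted in the limitations

    # Lowe's ratio test on 2-NN matches keeps only distinctive correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = [pair[0] for pair in matcher.knnMatch(des1, des2, k=2)
            if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance]
    if len(good) < 4:
        return None  # too few matches to fit a homography

    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    # Warping the previous UGV pixel position by H removes the apparent
    # motion caused by the camera, isolating the UGV's own motion.
    return H
```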
2:Sample Selection and Data Sources:
The UGV was equipped with an ArUco marker for detection. A training dataset (UGV_Train_Data, 102,003 images) and a test dataset (UGV_test_Data, over 5,000 images) were used for algorithm evaluation. Simulations were conducted in AirSim, and real-world experiments used a DJI-MATRICE 100 UAV and a remote-controlled UGV.
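For context, below is a minimal sketch of detecting the UGV's ArUco marker with OpenCV's aruco module (opencv-contrib-python, 4.7+ API); the marker dictionary and the input file name are assumptions, since the summary does not specify the marker family.

```python
import cv2

# Dictionary choice is a placeholder; the paper's marker family is not given here.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_5X5_100)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

frame = cv2.imread("fisheye_frame.png", cv2.IMREAD_GRAYSCALE)  # placeholder input
corners, ids, rejected = detector.detectMarkers(frame)
if ids is not None:
    # Each entry of `corners` is a 1x4x2 array of corner pixels; the centroid
    # serves as the UGV's image-plane position for the tracker.
    center = corners[0].reshape(4, 2).mean(axis=0)
    print("UGV marker at pixel", center)
```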
3:List of Experimental Equipment and Materials:
Equipment includes a fisheye lens camera (Point Grey GS3-U3-41C6NIR-C with Kowa LM4NCL lens), stereo camera (MYNT EYE S1010-IR), NVIDIA Jetson TX2 processor, DJI-MATRICE 100 UAV, and Parrot Bebop drone. Software tools include OpenCV, ROS, YOLOv3, and AirSim.
4:Experimental Procedures and Operational Workflow:
The system acquires fisheye and depth images, detects and tracks the UGV using YOLOv3 and tracking-by-detection, estimates the UGV's motion state via motion compensation and a velocity observer, and controls the UAV with a two-stage PID controller. Experiments comprised simulations in AirSim and real-world tests with varying UGV trajectories.
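To make the two-stage control idea concrete, here is a minimal discrete PID sketch that tracks the UGV laterally and descends once roughly centered; the plain linear PID form, gains, and thresholds are assumptions (the paper's controller is nonlinear), and only the 100 ms update period comes from the limitations above.

```python
class PID:
    """Textbook discrete PID; a simplification of the paper's nonlinear controller."""
    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error: float) -> float:
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

DT = 0.1  # 100 ms command period, per the onboard processing limit noted above
track = PID(kp=0.8, ki=0.05, kd=0.2, dt=DT)    # stage 1: hover over the UGV
descend = PID(kp=0.5, ki=0.02, kd=0.1, dt=DT)  # stage 2: lower onto the pad

def control_step(lateral_offset_m: float, altitude_m: float):
    """One control tick: lateral velocity command, plus descent once centered."""
    vx_cmd = track.step(lateral_offset_m)  # drive the lateral offset to zero
    # Descend only while roughly centered; the 0.2 m gate is an assumption.
    vz_cmd = descend.step(0.0 - altitude_m) if abs(lateral_offset_m) < 0.2 else 0.0
    return vx_cmd, vz_cmd
```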
5:Data Analysis Methods:
Performance was evaluated through qualitative and quantitative analyses, including comparison with ground truth in simulations; precision, recall, and F1-measure for detection; and error analysis for altitude and velocity estimation.
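The detection metrics above follow their standard definitions; the sketch below computes them from true-positive, false-positive, and false-negative counts (the example numbers are invented, not the paper's results).

```python
def detection_metrics(tp: int, fp: int, fn: int):
    """Precision, recall, and F1-measure from detection counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Example with made-up counts:
p, r, f1 = detection_metrics(tp=480, fp=15, fn=20)
print(f"precision={p:.3f} recall={r:.3f} F1={f1:.3f}")
```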