Research Objective
To develop a novel supervised change detection method for optical aerial images, based on a deep siamese semantic network trained with an improved triplet loss, in order to strengthen feature extraction and the learning of semantic relations between the paired images.
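The improved triplet loss itself is not spelled out in this summary; as background, a minimal sketch of a weight-sharing siamese feature extractor trained with the standard triplet margin loss is given below. The network `FeatureNet`, the patch-sampling convention, and all hyperparameters are illustrative assumptions, not the authors' exact design.

```python
# Minimal sketch (assumptions, not the paper's exact architecture): a siamese
# feature extractor whose two branches share weights, trained with the
# standard triplet margin loss; the paper's "improved" variant is not
# reproduced here.
import torch
import torch.nn as nn

class FeatureNet(nn.Module):
    """Shared-weight branch that maps an image patch to a feature map."""
    def __init__(self, in_ch=3, feat_ch=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.encoder(x)

# Both temporal images pass through the same branch (siamese weight sharing).
net = FeatureNet()
triplet_loss = nn.TripletMarginLoss(margin=1.0, p=2)

def training_step(anchor, positive, negative):
    """anchor/positive: patches of the same unchanged location across dates;
    negative: a patch from a changed location. Shapes: (B, 3, H, W)."""
    fa = net(anchor).flatten(1)
    fp = net(positive).flatten(1)
    fn_ = net(negative).flatten(1)
    # Pull unchanged pairs together, push changed pairs apart by the margin.
    return triplet_loss(fa, fp, fn_)
```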
Research Findings
The proposed method effectively detects changes in optical aerial images: the deep siamese semantic network trained with the improved triplet loss produces more robust features and better captures the semantic relations between the image pairs. It outperforms state-of-the-art methods in terms of F-measure and handles change regions of multiple scales well.
Research Limitations
The method requires supervised training on labeled data; labeling is time-consuming, and the approach does not transfer to settings where no labels are available. The threshold segmentation used to produce the final change map is simple and may not be optimal in all cases. Future work could explore semi-supervised or unsupervised techniques to improve the separability of changed and unchanged features, together with more refined threshold segmentation methods.
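The specific thresholding scheme is likewise not given here; the sketch below shows one common choice, Otsu's threshold applied to the per-pixel feature-distance map, purely as an assumed example of the simple threshold segmentation step. The function `distance_to_change_map` and its inputs are hypothetical.

```python
# Hypothetical post-processing sketch: binarize a per-pixel feature-distance
# map into a change map with Otsu's threshold. The paper's actual scheme may
# differ; this only illustrates the kind of simple thresholding noted above.
import numpy as np
from skimage.filters import threshold_otsu

def distance_to_change_map(feat_t1: np.ndarray, feat_t2: np.ndarray) -> np.ndarray:
    """feat_t1, feat_t2: (C, H, W) feature maps from the two siamese branches."""
    dist = np.linalg.norm(feat_t1 - feat_t2, axis=0)   # (H, W) Euclidean distance
    thresh = threshold_otsu(dist)                      # global Otsu threshold
    return (dist > thresh).astype(np.uint8)            # 1 = changed, 0 = unchanged
```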