Robust Object Detection for UAVs in Foggy Environments with Spatial-Edge Fusion and Dynamic Task Alignment
Highlights

What are the main findings? We introduce Fog-UAVNet, a lightweight detector with a unified design that fuses edge and spatial cues, adapts its receptive field to fog density, and aligns classification with localization. Across multiple fog benchmarks, Fog-UAVNet consistently achieves higher detection accuracy and faster inference than strong baselines, yielding a superior accuracy–efficiency trade-off under foggy conditions.

What are the implications of the main findings? Robust, real-time UAV perception is feasible without large models, enabling practical onboard deployment. The design offers a simple recipe for adverse-weather detection and may generalize across aerial scenarios.

Abstract

Robust scene perception in adverse environmental conditions, particularly under dense fog, remains a fundamental challenge for reliable object detection. To address this challenge, we propose Fog-UAVNet, a lightweight deep-learning architecture designed to enhance unmanned aerial vehicle (UAV) object detection in foggy environments. Fog-UAVNet incorporates three key innovations: the Spatial-Edge Feature Fusion Module (SEFFM), which enhances feature extraction by integrating edge and spatial information; the Frequency-Adaptive Dilated Convolution (FADC), which dynamically adjusts to variations in fog density and strengthens feature representation under adverse conditions; and the Dynamic Task-Aligned Head (DTAH), which aligns the localization and classification tasks to improve overall detection performance. To evaluate the effectiveness of our approach, we constructed a real-world foggy dataset and synthesized the VisDrone-fog dataset using an atmospheric scattering model.
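The VisDrone-fog synthesis mentioned above relies on the standard atmospheric scattering model, I(x) = J(x)·t(x) + A·(1 − t(x)) with transmission t(x) = exp(−β·d(x)). A minimal sketch of that formulation follows; the function name, parameter values, and toy depth map are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def synthesize_fog(image, depth, beta=1.2, airlight=0.9):
    """Render fog via the atmospheric scattering model
    I = J * t + A * (1 - t), with t(x) = exp(-beta * d(x)).

    image:    clear scene J, float array in [0, 1], shape (H, W, 3)
    depth:    per-pixel scene depth d(x), shape (H, W)
    beta:     scattering coefficient (larger -> denser fog); value is illustrative
    airlight: global atmospheric light A; value is illustrative
    """
    t = np.exp(-beta * depth)[..., None]      # transmission map, shape (H, W, 1)
    return image * t + airlight * (1.0 - t)   # hazy observation I

# Toy usage: a 4x4 gradient image whose depth grows left to right,
# so fog density (and the pull toward the airlight value) increases rightward.
J = np.linspace(0.0, 1.0, 4 * 4 * 3).reshape(4, 4, 3)
d = np.tile(np.linspace(0.0, 3.0, 4), (4, 1))
I = synthesize_fog(J, d)
```

In practice, depth for a real dataset would come from a depth estimator or sensor; varying beta produces a range of fog densities from the same clear images.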
Extensive experiments on multiple challenging datasets demonstrate that Fog-UAVNet consistently outperforms state-of-the-art methods in both detection accuracy and computational efficiency, highlighting its potential for robust visual perception under adverse weather.
Copyright of Remote Sensing is the property of MDPI.