OBJECT DETECTION IN UAV-BORNE THERMAL IMAGES USING BOUNDARY-AWARE SALIENCY MAPS
In this paper, we propose an object detection method for thermal images acquired from unmanned aerial vehicles (UAVs). Compared with visible images, thermal images place lower demands on illumination conditions, but they suffer from problems such as blurred edges and low contrast. To address these problems, we use the saliency map of a thermal image as an attention mechanism for the object detector, enhancing the image before detection. In this work, the YOLOv3 network is trained as a detection benchmark, and BASNet is used to generate saliency maps from the thermal images. We fuse each thermal image with its corresponding saliency map through pixel-level weighted fusion. Experimental results on real data show that the proposed method can accomplish object detection in UAV-borne thermal images. Statistically, the average precision (AP) for pedestrians and vehicles increases by 4.5% and 2.6%, respectively, compared with the benchmark YOLOv3 model trained on the thermal images alone. The proposed model provides reliable technical support for applications of thermal imagery on UAV platforms.
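The pixel-level weighted fusion step can be sketched as a simple per-pixel blend of the thermal image and its saliency map. The scalar weight `alpha` below is a hypothetical parameter for illustration; the paper does not specify the exact weighting scheme in the abstract.

```python
import numpy as np

def fuse_thermal_saliency(thermal, saliency, alpha=0.7):
    """Pixel-level weighted fusion of a thermal image with its saliency map.

    thermal, saliency: float arrays of the same shape, values in [0, 1].
    alpha: blend weight for the thermal image (assumed value; not from the paper).
    Returns the fused image used as input to the detector.
    """
    thermal = np.asarray(thermal, dtype=np.float64)
    saliency = np.asarray(saliency, dtype=np.float64)
    if thermal.shape != saliency.shape:
        raise ValueError("thermal and saliency maps must have the same shape")
    # Weighted sum per pixel: salient regions are emphasized,
    # background intensity from the thermal image is partially retained.
    return alpha * thermal + (1.0 - alpha) * saliency
```

For example, a pixel with thermal value 0.5 and saliency 1.0 fuses to 0.7 * 0.5 + 0.3 * 1.0 = 0.65 with the default weight, brightening salient regions relative to the raw thermal frame.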