Abstract

Objective

Detecting small, faraway objects in real-time surveillance is challenging because their limited pixel representation degrades classifier performance. Deep Learning (DL) techniques generate feature maps to enhance detection, but conventional methods suffer from high computational costs. To address this, we propose Multi-Scale Region-wise Pixel Analysis with GAN for Tiny Object Detection (MSRP-TODNet). The model is trained and tested on the VisDrone VID 2019 and MS-COCO datasets. First, images undergo two-fold pre-processing: the Improved Wiener Filter (IWF) removes artifacts and the Adjusted Contrast Enhancement Method (ACEM) corrects blurring. A Multi-Agent Reinforcement Learning (MARL) algorithm then splits the pre-processed image into four regions and analyses each pixel to generate feature maps. These maps are processed by the Enhanced Feature Pyramid Network (EFPN), which merges them into a single feature map. Finally, a Generative Adversarial Network (GAN) detects objects and outputs bounding boxes.
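For orientation, the sketch below (plain Python with NumPy) only mirrors the stage order described above; every function body is an illustrative stand-in introduced here, and the names improved_wiener_filter, adjusted_contrast_enhancement, split_into_four_regions, efpn_merge, and gan_detect are assumptions, not the authors' implementation.

    import numpy as np

    def improved_wiener_filter(img):
        # Stand-in for IWF artifact removal (identity placeholder here).
        return img

    def adjusted_contrast_enhancement(img):
        # Stand-in for ACEM blur/contrast correction (simple min-max stretch).
        lo, hi = img.min(), img.max()
        return (img - lo) / (hi - lo + 1e-8)

    def split_into_four_regions(img):
        # One region per quadrant; the paper assigns MARL agents per region.
        h, w = img.shape[:2]
        return [img[:h // 2, :w // 2], img[:h // 2, w // 2:],
                img[h // 2:, :w // 2], img[h // 2:, w // 2:]]

    def region_feature_map(region):
        # Stand-in for per-region, pixel-wise analysis producing a feature map.
        return region.mean(axis=-1) if region.ndim == 3 else region

    def efpn_merge(maps):
        # Stand-in for EFPN fusion: reassemble the four quadrant maps into one.
        top = np.hstack([maps[0], maps[1]])
        bottom = np.hstack([maps[2], maps[3]])
        return np.vstack([top, bottom])

    def gan_detect(feature_map):
        # Stand-in for the GAN detector head; returns (x, y, w, h) boxes.
        return [(0, 0, 8, 8)]

    def msrp_todnet(img):
        img = adjusted_contrast_enhancement(improved_wiener_filter(img))
        maps = [region_feature_map(r) for r in split_into_four_regions(img)]
        fused = efpn_merge(maps)
        return gan_detect(fused)

    if __name__ == "__main__":
        dummy = np.random.rand(64, 64, 3).astype(np.float32)
        print(msrp_todnet(dummy))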

Results

Experimental results on the DOTA dataset show that MSRP-TODNet outperforms existing state-of-the-art methods. Specifically, it achieves 84.2% mAP@0.5, 54.1% mAP@0.5:0.95, and an F1-score of 84.0%, surpassing improved TPH-YOLOv5, YOLOv7-Tiny, and DRDet by margins of 1.7%–6.1% in detection performance. These results demonstrate the framework's effectiveness for accurate, real-time small-object detection in UAV surveillance and aerial imagery.
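For readers unfamiliar with the reported metrics, the short Python sketch below shows how F1 combines precision and recall and how an IoU threshold of 0.5 decides whether a prediction counts as a true positive for mAP@0.5; the numeric values are illustrative only, not the paper's intermediate results.

    def f1_score(precision, recall):
        # Harmonic mean of precision and recall.
        return 2 * precision * recall / (precision + recall)

    def iou(box_a, box_b):
        # Boxes as (x1, y1, x2, y2); at mAP@0.5 a prediction matches a
        # ground-truth box when their IoU is at least 0.5.
        ax1, ay1, ax2, ay2 = box_a
        bx1, by1, bx2, by2 = box_b
        iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
        ih = max(0.0, min(ay2, by2) - max(ay1, by1))
        inter = iw * ih
        union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
        return inter / union if union > 0 else 0.0

    print(round(f1_score(0.84, 0.84), 3))        # 0.84, i.e. an F1 of 84.0%
    print(round(iou((0, 0, 10, 10), (5, 0, 15, 10)), 3))  # 0.333 -> no match at 0.5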

Details

Title
MSRP-TODNet: a multi-scale reinforced region wise analyser for tiny object detection
Author
Bikku, Thulasi; Satya Sree, K. P. N. V.; Thota, Srinivasarao; Malligunta, Kiran Kumar; Shanmugasundaram, P.
Pages
1-11
Section
Research Note
Publication year
2025
Publication date
2025
Publisher
BioMed Central
e-ISSN
1756-0500
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
3201566556
Copyright
© 2025. This work is licensed under http://creativecommons.org/licenses/by-nc-nd/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.