Smart Traffic Monitoring Through Real-Time Moving Vehicle Detection Using Deep Learning via Aerial Images for Consumer Application

Avaneesh Singh, Mohammad Zia Ur Rahma, Preeti Rani, Navin Kumar Agrawal, Rohit Sharma, Elham Kariri, Daniel Gavilanes Aray

Research output: Contribution to journal › Article › peer-review

16 Scopus citations

Abstract

This paper presents a novel deep-learning method for detecting and tracking vehicles in autonomous driving scenarios, with a focus on vehicle failure situations. The primary objective is to enhance road safety by accurately identifying and monitoring vehicles. Our approach combines YOLOv8 models with Transformer-based convolutional neural networks (CNNs) to address the limitations of traditional CNNs in capturing high-level semantic information. A key contribution is the integration of a modified pyramid pooling model for real-time vehicle detection and kernelized filter-based techniques for efficient vehicle tracking with minimal human intervention. The proposed method demonstrates significant improvements in detection accuracy, with experimental results showing increases of 4.50%, 4.46%, and 3.59% on the DLR3K, VEDAI, and VAID datasets, respectively. Our qualitative and quantitative analysis highlights the model's robustness in handling shadows and occlusions in traffic scenes, outperforming several existing methods. This research contributes a more effective solution for real-time multi-vehicle detection and tracking in autonomous driving systems.
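The abstract describes a tracking-by-detection pipeline: a detector (here, YOLOv8-based) produces vehicle boxes per aerial frame, and a tracker associates those boxes across frames. As a minimal illustrative sketch only (not the authors' kernelized-filter method), the association step can be implemented with greedy intersection-over-union (IoU) matching between consecutive frames; the function names and the 0.3 threshold below are assumptions for illustration.

```python
# Illustrative tracking-by-detection sketch (not the paper's method):
# greedily match vehicle boxes between consecutive aerial frames by IoU.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def associate(prev_boxes, curr_boxes, thresh=0.3):
    """Map each previous-frame track index to its best current detection.

    Greedy one-to-one matching: a current box is claimed by at most one
    track, and matches below the IoU threshold are left unassigned.
    """
    matches, used = {}, set()
    for i, p in enumerate(prev_boxes):
        best_j, best_iou = None, thresh
        for j, c in enumerate(curr_boxes):
            if j in used:
                continue
            score = iou(p, c)
            if score > best_iou:
                best_j, best_iou = j, score
        if best_j is not None:
            matches[i] = best_j
            used.add(best_j)
    return matches
```

Greedy IoU matching is a common simple baseline; the paper's kernelized correlation-filter tracking instead learns a per-target appearance filter, which handles the shadows and occlusions highlighted in the abstract more robustly.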

Original language: English
Pages (from-to): 7302-7309
Number of pages: 8
Journal: IEEE Transactions on Consumer Electronics
Volume: 70
Issue number: 4
DOIs
State: Published - 2024

Keywords

  • consumer application
  • deep learning
  • imaging technologies
  • intelligent transportation systems
  • object detection
  • tracking

