Deep Learning Enabled Garbage Classification and Detection by Visual Context for Aerial Images

Agnivesh Pandey, Rohit Raja, Manoj Gupta, Farhan A. Alenizi, Pannee Suanpang, Aziz Nanthaamornphong

Research output: Contribution to journal › Article › peer-review

Abstract

Environmental pollution caused by garbage is a significant problem in most developing countries. Proper garbage waste processing, management, and recycling are crucial for both ecological and economic reasons. Computer vision techniques have shown advanced capabilities in various applications, including object detection and classification. In this study, we conducted an extensive review of the use of artificial intelligence for garbage processing and management. A major limitation in this field, however, is the lack of datasets containing top-view images of garbage. We introduce a new dataset named “KACHARA,” containing 4727 images categorized into seven classes: clothes, decomposable (organic waste), glass, metal, paper, plastic, and wood. Importantly, the dataset exhibits a moderate class imbalance, mirroring the distribution of real-world garbage, which is essential for training accurate classification models. For classification, we apply transfer learning with the well-known deep learning model MobileNetV3-Large, fine-tuning its top layers to enhance performance. We achieved a classification accuracy of 94.37% and also evaluated performance using precision, recall, F1-score, and the confusion matrix. These results demonstrate the model’s strong generalization in aerial/top-view garbage classification.
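The transfer-learning setup described in the abstract — a pretrained MobileNetV3-Large backbone with only its top layers fine-tuned for the seven garbage classes — can be sketched in Keras roughly as below. This is a minimal illustration, not the paper's configuration: the input size, the number of unfrozen layers, the classification head, and the optimizer settings are all assumptions (weights=None is used here only to keep the sketch offline; in practice one would load weights="imagenet").

```python
import tensorflow as tf

NUM_CLASSES = 7  # clothes, decomposable, glass, metal, paper, plastic, wood

# Pretrained backbone without its classifier head.
# (weights="imagenet" in practice; None here to avoid a download.)
base = tf.keras.applications.MobileNetV3Large(
    input_shape=(224, 224, 3), include_top=False, weights=None)

# Freeze the backbone, then unfreeze only the top layers for fine-tuning.
base.trainable = False
for layer in base.layers[-30:]:  # the cutoff of 30 layers is an assumption
    layer.trainable = True

# Attach a small classification head for the seven KACHARA classes.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

# A low learning rate is typical when fine-tuning pretrained layers.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

Training would then proceed with `model.fit(...)` on the image dataset; the dropout layer and low learning rate are common choices to limit overfitting when only a few thousand images are available.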

Original language: English
Article number: 9106130
Journal: Applied Computational Intelligence and Soft Computing
Volume: 2025
Issue number: 1
DOIs
State: Published - 2025

Keywords

  • aerial images
  • deep learning
  • object classification
  • object detection
  • transfer learning
