TY - JOUR
T1 - Artificial Intelligence-driven Gesture Recognition for Hearing-impaired People Using Reptile Search Algorithm with Deep Learning Model
AU - Assiri, Mohammed
AU - Mohamed Saleh, Mahmoud
N1 - Publisher Copyright:
© 2025 The Author(s).
PY - 2025/8/4
Y1 - 2025/8/4
N2 - Automated gesture recognition (GR) is applied to provide feedback in a computerized learning environment in which the deaf community can practise sign language (SL). Globally, deaf and mute people face challenges expressing their feelings to others, and hearing- and speech-impaired individuals also encounter difficulties expressing themselves in public spaces. To address this, a smart system powered by artificial intelligence (AI) was developed to assist and enhance communication for the deaf community. The performance of such applications is challenging because of the enormous number of SLs. Recent developments in AI and machine learning have helped automate and improve such systems. This study proposes an AI-driven GR approach for hearing-impaired people using a reptile search algorithm with deep learning (AIGRHIP-RSADL). The AIGRHIP-RSADL approach presents a robust framework for recognizing and classifying hand gestures using deep learning and hyperparameter tuning techniques. Initially, the AIGRHIP-RSADL technique applies bilateral filtering (BF) to remove noise and enhance gesture image quality. Next, the InceptionResNetV2 model captures intricate gesture details and improves feature representation. The variational autoencoder (VAE) model then classifies and recognizes gestures. Finally, hyperparameter tuning of the VAE model is performed using the improved reptile search algorithm to ensure optimal performance through an efficient search of complex parameter spaces. A comprehensive simulation study was conducted to demonstrate the improved performance of the AIGRHIP-RSADL model. The comparison analysis showed that the AIGRHIP-RSADL model attained a superior accuracy of 98.10% over existing techniques.
AB - Automated gesture recognition (GR) is applied to provide feedback in a computerized learning environment in which the deaf community can practise sign language (SL). Globally, deaf and mute people face challenges expressing their feelings to others, and hearing- and speech-impaired individuals also encounter difficulties expressing themselves in public spaces. To address this, a smart system powered by artificial intelligence (AI) was developed to assist and enhance communication for the deaf community. The performance of such applications is challenging because of the enormous number of SLs. Recent developments in AI and machine learning have helped automate and improve such systems. This study proposes an AI-driven GR approach for hearing-impaired people using a reptile search algorithm with deep learning (AIGRHIP-RSADL). The AIGRHIP-RSADL approach presents a robust framework for recognizing and classifying hand gestures using deep learning and hyperparameter tuning techniques. Initially, the AIGRHIP-RSADL technique applies bilateral filtering (BF) to remove noise and enhance gesture image quality. Next, the InceptionResNetV2 model captures intricate gesture details and improves feature representation. The variational autoencoder (VAE) model then classifies and recognizes gestures. Finally, hyperparameter tuning of the VAE model is performed using the improved reptile search algorithm to ensure optimal performance through an efficient search of complex parameter spaces. A comprehensive simulation study was conducted to demonstrate the improved performance of the AIGRHIP-RSADL model. The comparison analysis showed that the AIGRHIP-RSADL model attained a superior accuracy of 98.10% over existing techniques.
KW - Deep learning
KW - Gesture recognition
KW - Hearing-impaired people
KW - Reptile search algorithm
UR - http://www.scopus.com/inward/record.url?scp=105013236583&partnerID=8YFLogxK
U2 - 10.57197/JDR-2025-0657
DO - 10.57197/JDR-2025-0657
M3 - Article
AN - SCOPUS:105013236583
SN - 2676-2633
VL - 4
JO - Journal of Disability Research
JF - Journal of Disability Research
IS - 4
M1 - e20250657
ER -