Iraqi Sign Language Translator system using Deep Learning

Authors

  • Rajaa Mohammed, Computer Science Department, College of Sciences for Women, University of Baghdad, Iraq
  • Suhad M. Kadhem, Computer Science Department, University of Technology, Baghdad, Iraq

DOI:

https://doi.org/10.55145/ajest.2023.01.01.0013

Keywords:

Sign language, Iraqi sign language, Video processing, Deep learning, CNN

Abstract

Deaf and mute people use sign language, moving their hands, faces, and bodies to communicate with each other and with hearing people. Sign language is a form of non-verbal communication: messages are sent and received without spoken words. The number of deaf people is growing worldwide, and in Iraq in particular. Given the difficulty the deaf have in communicating with the wider world, and the difficulty their families face in learning sign language, other ways are needed to help the deaf communicate efficiently with hearing people and to learn sign language easily. One such method is to use artificial intelligence to recognize hand gestures and build translation software. This paper presents a computer program that translates Iraqi sign language into Arabic text. Translation starts with capturing videos to build the dataset; the proposed system then uses a convolutional neural network (CNN) to classify signs from their features and infer the meaning of each sign. The part of the proposed system that translates sign language into Arabic text achieves 99% accuracy on word signs.
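The abstract describes a pipeline of the form frame capture → feature extraction → CNN classification → word label. The paper itself includes no code, so the following is only an illustrative sketch of that classification step in plain NumPy: one convolutional layer, ReLU, global average pooling, and a linear softmax head. The frame size (64×64 grayscale), the kernel and weight values, and the word labels are all hypothetical stand-ins, not the authors' trained model.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a grayscale image with one kernel."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def softmax(z):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def classify_frame(frame, kernels, weights, labels):
    """Conv layer -> ReLU -> global average pool -> linear softmax."""
    feats = np.array([np.maximum(conv2d(frame, k), 0).mean() for k in kernels])
    probs = softmax(weights @ feats)
    return labels[int(np.argmax(probs))], probs

# Toy example: random weights stand in for a trained network.
rng = np.random.default_rng(0)
frame = rng.random((64, 64))             # one preprocessed video frame
kernels = rng.standard_normal((4, 3, 3)) # 4 hypothetical learned filters
weights = rng.standard_normal((3, 4))    # 3 hypothetical word classes
labels = ["hello", "thanks", "yes"]      # placeholder word labels
word, probs = classify_frame(frame, kernels, weights, labels)
```

In the real system the filters and weights would come from training the CNN on the captured video dataset; the sketch only shows the shape of the inference step that maps a frame to a word label.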

Published

2023-01-23

How to Cite

Mohammed, R., & Kadhem, S. M. (2023). Iraqi Sign Language Translator system using Deep Learning. Al-Salam Journal for Engineering and Technology, 2(1), 109–116. https://doi.org/10.55145/ajest.2023.01.01.0013

Section

Articles