Iraqi Sign Language Translator system using Deep Learning
DOI: https://doi.org/10.55145/ajest.2023.01.01.0013

Keywords: Sign language, Iraqi sign language, Video processing, Deep learning, CNN

Abstract
Deaf and mute people use sign language, moving their hands, faces, and bodies to communicate with each other and with hearing people. Sign language is a form of non-verbal communication: messages are sent and received between people without spoken words. The number of deaf people is increasing worldwide, and in Iraq in particular. Given the difficulties they face in communicating and the difficulty families of deaf and hard-of-hearing people have in learning sign language, other methods are needed to help the deaf communicate efficiently with hearing people and to make sign language easier to learn. One such method is to use artificial intelligence to build translation software that recognizes hand gestures. This paper presents a computer program that translates Iraqi sign language into Arabic text. The translation pipeline starts by capturing videos to build the dataset; the proposed system then uses a convolutional neural network (CNN) to classify the sign from its visual features and infer its meaning. The component of the proposed system that translates sign language into Arabic text achieves 99% accuracy on word signs.
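The abstract describes a pipeline in which video frames of a gesture are passed through a CNN that classifies the sign into a word class. The paper does not give the network architecture, so the following is only a minimal NumPy sketch of the core CNN operations (convolution, ReLU, max-pooling, and a softmax classifier head); the frame size, filter, and five word classes are illustrative assumptions, not the authors' model.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a single-channel image with one filter."""
    h, w = image.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(0.0, x)

def max_pool(x, size=2):
    """Non-overlapping max-pooling that halves each spatial dimension."""
    h, w = x.shape
    h2, w2 = h // size, w // size
    return x[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
frame = rng.random((16, 16))                 # toy grayscale gesture frame (assumption)
kernel = rng.standard_normal((3, 3))         # one filter; would be learned in training
features = max_pool(relu(conv2d(frame, kernel)))
W = rng.standard_normal((5, features.size))  # classifier for 5 hypothetical word classes
probs = softmax(W @ features.ravel())        # probability per sign-word class
predicted_word = int(np.argmax(probs))
```

In a real system the filter weights and classifier matrix would be learned from the captured video dataset, and a full-frame video clip would typically feed many stacked convolutional layers rather than the single layer shown here.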
Copyright (c) 2023 Rajaa Mohammed, Suhad M. Kadhem
This work is licensed under a Creative Commons Attribution 4.0 International License.