Student Develops AI Model for Translating Sign Language - Creative Word

A student from Tamil Nadu, India, has developed an AI model that achieves real-time translation of sign language.

This is an impressive accomplishment that even big-tech companies have yet to achieve.

According to an article by the United Business Journal, the third-year engineering student, Priyanjali Gupta, has created an AI model that translates American Sign Language (ASL) into English instantly.

Priyanjali shared her project on LinkedIn, writing “Made this Model using this cool TensorFlow object Detection API the model translates a few ASL signs to English using Transfer learning from pretrained ssd_mobilenet model.”
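In a detector of this kind, the fine-tuned SSD model returns class IDs and confidence scores for each webcam frame, and a small post-processing step maps those IDs to English words. The sketch below illustrates that final step only; the label map and threshold are hypothetical, as the actual classes come from the trained model.

```python
# Hypothetical label map for a detector fine-tuned on three ASL signs.
LABEL_MAP = {1: "hello", 2: "goodbye", 3: "thank you"}

def translate_detections(class_ids, scores, threshold=0.5):
    """Map detector class IDs to English words, keeping only confident hits."""
    return [
        LABEL_MAP[cid]
        for cid, score in zip(class_ids, scores)
        if score >= threshold and cid in LABEL_MAP
    ]

# Example: per-frame detector output (class IDs and confidence scores).
print(translate_detections([1, 3, 2], [0.92, 0.40, 0.77]))
# → ['hello', 'goodbye']  (the low-confidence 'thank you' is filtered out)
```

In the full pipeline, the TensorFlow Object Detection API would supply these IDs and scores from each frame before this mapping runs.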

She says the model can translate simple ASL signs such as 'hello', 'goodbye', and 'thank you', and the post includes a video showing the application working successfully.

In an interview, Priyanjali stated that "the data-set for the AI model uses Python Library aimed at real-time computer vision which takes images from the Webcam."
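A dataset built this way is typically organised as one folder per sign, with each captured frame saved under a unique filename. The sketch below shows that scaffolding only, with hypothetical folder and sign names; in the real pipeline each path would receive an actual frame from the webcam (e.g. via OpenCV's `cv2.VideoCapture`).

```python
import os
import uuid

def frame_path(root, sign):
    """Return a unique file path for the next captured webcam frame,
    creating a per-sign folder on first use."""
    folder = os.path.join(root, sign)
    os.makedirs(folder, exist_ok=True)
    return os.path.join(folder, f"{sign}_{uuid.uuid4().hex}.jpg")

# Example: where the next 'hello' frame would be stored.
print(frame_path("asl_dataset", "hello"))
```

Each folder of images would then be labelled and fed to the object-detection training step.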

She also credited Nicholas Renotte, a Senior Data Science and AI Specialist at IBM, whose tutorials on real-time sign language detection helped her create the AI model.

Responding to a comment on her LinkedIn post, she acknowledged that developing a machine learning model for sign language detection is a difficult task, but said she believes it is not unfeasible that, sooner or later, the worldwide community of machine learning developers will build a deep learning model dedicated solely to sign language.

This can only be good news for the 70 million deaf people around the world and those who rely on sign language to communicate on a daily basis. The hope is that an AI model will eventually be created that is capable of translating every sign language, though the task is complex, as each sign language has its own distinct set of signs.

However, this isn’t the first sign language translation project.

In 2018, the Centre for Vision, Speech and Signal Processing (CVSSP) at the University of Surrey began work on a project that aimed to translate British Sign Language (BSL) into written words.

The project, worth around £1 million, aimed to develop a machine that recognises the hand motion, hand shape, facial expressions, and body language of the person signing, processes this information into candidate phrases, and then translates them into either written or spoken English.

AI technology has the potential to connect people around the world by aiding communication across different languages, cultures, and abilities.

Here, at Creative Word, we can’t wait to witness the evolution of these AI models and look forward to their mainstream use.

If you’d like to share your opinion regarding this exciting development, please leave your message in the comments section below.