Indian engineering student Priyanjali Gupta has developed a one-of-a-kind artificial intelligence model capable of translating American Sign Language (ASL) into English in real time.
Priyanjali is a third-year computer science student at the Vellore Institute of Technology (VIT), a renowned institute in Tamil Nadu.
According to Priyanjali, her AI-powered model was inspired by data scientist Nicholas Renotte's video on Real-Time Sign Language Detection. She built the model with the TensorFlow Object Detection API, translating hand signs by applying transfer learning to a pre-trained model, ssd_mobilenet.
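In the TensorFlow Object Detection API, this kind of transfer learning is typically configured through a `pipeline.config` file that points training at a pre-trained checkpoint. The excerpt below is a hypothetical sketch of the relevant fields; the checkpoint path, model variant, and batch size are illustrative assumptions, not details taken from Priyanjali's project:

```
model {
  ssd {
    num_classes: 6   # the six ASL signs replace the original COCO classes
    # ... remaining ssd_mobilenet architecture settings left unchanged ...
  }
}
train_config {
  batch_size: 4
  # Reuse the detection weights of a pre-trained SSD MobileNet model
  # (path is an assumed example from the TF2 Detection Model Zoo)
  fine_tune_checkpoint: "pre-trained-models/ssd_mobilenet_v2_fpnlite_320x320_coco17_tpu-8/checkpoint/ckpt-0"
  fine_tune_checkpoint_type: "detection"
}
```

Because only the classification head needs to adapt to the new classes, a small manually collected dataset can be enough to fine-tune the detector.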
Priyanjali said, “The dataset is manually made with a computer webcam and given annotations. The model, for now, is trained on single frames. To detect videos, the model has to be trained on multiple frames, for which I’m likely to use LSTM. I’m currently researching on it.”
She further added that building a deep learning model dedicated to sign language detection is quite challenging, and she believes the open-source community, which is far more experienced than she is, will find a solution soon.
Additionally, she mentioned that it might become possible to build deep learning models dedicated solely to sign languages. In her GitHub post, she explained that the dataset was created by running the Image Collection Python file, which captures webcam images for several American Sign Language signs: Hello, I Love You, Thank You, Please, Yes, and No.
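Priyanjali's actual script is not reproduced here, but an image-collection step like the one she describes is often written with OpenCV: loop over the sign labels, pause so the signer can pose, and save uniquely named webcam frames into one folder per label. The sketch below is a minimal, hypothetical version under those assumptions; the folder layout, image counts, and delays are illustrative, not taken from her repository:

```python
import os
import time
import uuid

try:
    import cv2  # OpenCV, needed only for the actual webcam capture
except ImportError:
    cv2 = None

# Assumed labels, taken from the signs named in her GitHub post
LABELS = ["hello", "iloveyou", "thankyou", "please", "yes", "no"]
IMAGES_PER_LABEL = 15
BASE_DIR = os.path.join("workspace", "images", "collectedimages")


def make_image_path(base_dir, label):
    """Create the per-label folder and return a unique .jpg path for one capture."""
    folder = os.path.join(base_dir, label)
    os.makedirs(folder, exist_ok=True)
    return os.path.join(folder, f"{label}.{uuid.uuid4()}.jpg")


def collect_images(base_dir=BASE_DIR, labels=LABELS, count=IMAGES_PER_LABEL):
    """Capture `count` webcam frames per label, pausing so the signer can pose."""
    if cv2 is None:
        raise RuntimeError("OpenCV (cv2) is required for webcam capture")
    cap = cv2.VideoCapture(0)  # default webcam
    for label in labels:
        print(f"Collecting images for {label}")
        time.sleep(3)  # time to get the hand sign ready
        for _ in range(count):
            ok, frame = cap.read()
            if not ok:
                continue
            cv2.imwrite(make_image_path(base_dir, label), frame)
            time.sleep(2)  # pause between captures for pose variation
    cap.release()
```

Calling `collect_images()` with a webcam attached would fill one folder per sign with annotated-ready frames; the UUID in each filename keeps repeated runs from overwriting earlier captures.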
Although ASL is reported to be the third most commonly used language in the United States, little effort has gone into building translation tools for it. This newly developed AI model is a big step towards creating translation models for sign languages.
Priyanjali said in an interview, “She (mother) taunted me. But it made me contemplate what I could do with my knowledge and skill set. One fine day, amid conversations with Alexa, the idea of inclusive technology struck me. That triggered a set of plans.”