MonVoix - An Android Application for Hearing-Impaired People

Rachana Kamat, Aishwarya Danoji, Aishwarya Dhage, Priya Puranik, Sharmila Sengupta

Abstract


Human communication is the foundation of a cooperative environment for sharing information and knowledge through interactive sessions. A hearing individual voices his or her views through shared conventions such as speech, facial expressions, and hand gestures, whereas people with hearing disabilities are obliged to rely on interpreters for day-to-day conversations. Interpretation of the various sign languages is therefore important: it narrows the social divide and acts as an agent of communal integration. This paper proposes an Android application for coherent interpretation of sign languages. MonVoix, French for "my voice," aims to serve the deaf and the mute by eliminating the need for a human interpreter. The approach uses the user's smartphone camera to capture a series of hand gestures and converts the captured images into the corresponding text and audio message, using image-processing techniques and database matching to identify each gesture.
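The keywords below name Otsu binarization as one of the image-processing steps used to separate the hand gesture from the background before matching. The paper's own implementation is not shown here; as a hedged illustration only, a minimal NumPy sketch of Otsu's method (which picks the threshold maximizing between-class variance of an 8-bit grayscale histogram) could look like:

```python
import numpy as np

def otsu_threshold(gray):
    """Return the Otsu threshold for an 8-bit grayscale image.

    Otsu's method chooses the intensity t that maximizes the
    between-class variance of the background/foreground split.
    """
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                # class probability up to t
    mu = np.cumsum(prob * np.arange(256))  # cumulative mean up to t
    mu_total = mu[-1]
    # Between-class variance for every candidate threshold t.
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_total * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b = np.nan_to_num(sigma_b)       # undefined at the extremes
    return int(np.argmax(sigma_b))

def binarize(gray):
    """Binarize a grayscale hand image: foreground -> 255, background -> 0."""
    t = otsu_threshold(gray)
    return (gray > t).astype(np.uint8) * 255
```

In practice an Android app would more likely call an existing routine (e.g. OpenCV's Otsu-mode thresholding) rather than hand-roll the histogram computation; the sketch is only meant to make the technique concrete.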

 


Keywords


Sign language, hand gestures, Android application, Otsu binarization, image processing, FFT, Canny edge detection, histogram equalization, dilation.






DOI: http://dx.doi.org/10.22385/jctecs.v8i0.123