Summary:
The paper proposes a system for recognizing the letters of the alphabet that are represented in American Sign Language by static gestures. The letters J and Z could not be recognized because their signs involve motion. The recognition system collected data from an 18-sensor CyberGlove using LabVIEW, loaded the data into a MATLAB program to train a perceptron network, and then used a second MATLAB program to match glove input to the most closely corresponding letter. Only one user was selected to train the neural network, and the system achieved an accuracy of "up to 90%". The fact that MATLAB does not run in real time by default was cited as a main obstacle in development.
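The paper's code is not reproduced here, but a minimal MATLAB sketch of the pipeline as described might look like the following. It assumes the glove data arrive as an 18 x N matrix with one column of sensor readings per sample, and labels 1..24 for the static letters (A through Y, skipping J); the names trainPerceptron and classifyLetter and all parameter choices are hypothetical, not taken from the paper.

% Train one perceptron output unit per letter, updating only on mistakes.
% X      - 18 x N matrix of glove sensor readings (one column per sample)
% labels - 1 x N vector of letter indices (1..24)
function W = trainPerceptron(X, labels, numClasses, epochs, lr)
    [numSensors, N] = size(X);
    W  = zeros(numClasses, numSensors + 1);  % +1 column for the bias weight
    Xb = [X; ones(1, N)];                    % append a constant bias input
    for e = 1:epochs
        for i = 1:N
            [~, pred] = max(W * Xb(:, i));   % currently predicted letter
            t = labels(i);
            if pred ~= t                     % perceptron rule: correct errors only
                W(t, :)    = W(t, :)    + lr * Xb(:, i)';
                W(pred, :) = W(pred, :) - lr * Xb(:, i)';
            end
        end
    end
end

% Recognize a glove frame: pick the letter whose unit responds most strongly.
function letter = classifyLetter(W, sample)
    letters = 'ABCDEFGHIKLMNOPQRSTUVWXY';    % J and Z omitted (moving signs)
    [~, idx] = max(W * [sample; 1]);
    letter = letters(idx);
end

With the weights trained once per user, as in the paper, matching a new glove frame is just a single matrix multiply, e.g. letter = classifyLetter(W, frame).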
Discussion:
The paper admittedly "took a more narrow initial focus," so the fact that only static gestures were recognized is understandable. Still, translating static poses of one user into spoken letters is very far from the long-term goal of translating moving gestures into spoken English. The single reported figure of "up to 90%" accuracy was not very illuminating; without a per-letter breakdown or results from signers outside the training set, it is hard to judge how well the approach would generalize.