Washington: A group of students has developed a prototype device that reads sign language and translates its motions into audible words.
Communication barriers often exist between those who can hear and those who cannot.
Though sign language has helped bridge such gaps, many people are still not fluent in its motions and hand shapes.
Now, thanks to a group of University of Houston students who developed such a prototype, the hearing impaired may soon have an easier time communicating with those who do not understand sign language.
The students in UH’s engineering technology and industrial design programs teamed up to develop the concept and prototype for ‘MyVoice’, a device that reads sign language and translates its motions into audible words.
‘MyVoice’ recently earned first place among student projects at the American Society of Engineering Education (ASEE) Gulf Southwest Annual Conference.
The concept of ‘MyVoice’ focuses on a handheld tool with a built-in microphone, speaker, soundboard, video camera and monitor. It would be placed on a hard surface, where it would read a user’s sign language motions.
Once ‘MyVoice’ processes the movements, it translates the sign language into speech through an electronic voice.
Likewise, it would capture a person’s voice and translate the spoken words into sign language, which is displayed on its monitor.
The industrial designers researched the application of ‘MyVoice’ by reaching out to the deaf community to understand the challenges that arise when others do not understand sign language.
“The biggest difficulty was assembling a database of images of the signs. It involved 200-300 images per sign,” Seto said.
“The team was ecstatic when the prototype came together.”
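The article does not describe how the prototype matches a captured motion against its image database. A minimal sketch of one plausible approach, simple template matching against a per-sign image collection, is shown below; all names, data, and the matching method itself are illustrative assumptions, not details from the MyVoice team.

```python
import numpy as np

def best_matching_sign(frame, sign_templates):
    """Return the label of the stored template image closest to the frame.

    sign_templates maps a sign label to a list of template images
    (plain 2-D grayscale arrays), loosely mimicking the database of
    200-300 images per sign mentioned in the article. This is a crude
    nearest-template classifier, not the prototype's actual method.
    """
    best_label, best_score = None, float("inf")
    for label, templates in sign_templates.items():
        for template in templates:
            # Mean absolute pixel difference as a rough similarity score.
            score = np.abs(frame.astype(float) - template.astype(float)).mean()
            if score < best_score:
                best_label, best_score = label, score
    return best_label

# Tiny synthetic "database": 4x4 patterns standing in for sign images.
templates = {
    "hello": [np.full((4, 4), 200, dtype=np.uint8)],
    "thanks": [np.full((4, 4), 50, dtype=np.uint8)],
}
frame = np.full((4, 4), 190, dtype=np.uint8)  # a noisy capture of "hello"
print(best_matching_sign(frame, templates))  # -> hello
```

In a real device, the recognized label would then be fed to a text-to-speech engine to produce the electronic voice the article describes.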
From its conceptual stage, ‘MyVoice’ evolved into a prototype that could translate a single phrase: “A good job, Cougars.”