Dallas university team develops sign language recognition system using TI parts, partly TI funded

A team of Dallas researchers has developed a wearable American Sign Language recognition system that detects gestures in real time and looks like something out of a sci-fi movie.

The system consists of two devices: one that sits on a person’s wrist to measure hand motion and one that rests on the arm to measure muscle activity. Once the sensors capture the movements, the information is sent wirelessly to a computer to be translated into text or speech.
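
The article does not include the team’s code, but the pipeline it describes — motion and muscle signals captured for each gesture, sent to a computer, and matched against known signs — can be sketched roughly. The Python snippet below is a minimal, hypothetical illustration only, not the researchers’ actual method: the window size, the features, and the nearest-template classifier are all assumptions, and synthetic data stands in for the wireless sensor stream.

```python
# Illustrative sketch only: NOT the UT Dallas team's code.
# Wrist motion (IMU) and arm muscle (EMG) windows are fused into a
# feature vector, matched against per-word templates, and emitted as text.
import numpy as np

WINDOW = 50  # samples per gesture window (assumed)


def extract_features(imu: np.ndarray, emg: np.ndarray) -> np.ndarray:
    """Fuse one window of IMU (N x 3) and EMG (N x 1) samples into a
    small feature vector: per-axis motion statistics plus EMG energy."""
    return np.concatenate([
        imu.mean(axis=0),                 # average hand motion per axis
        imu.std(axis=0),                  # motion variability per axis
        [np.sqrt((emg ** 2).mean())],     # EMG root-mean-square (muscle effort)
    ])


class NearestTemplateRecognizer:
    """Tiny nearest-centroid classifier over per-word feature templates."""

    def __init__(self):
        self.templates: dict[str, np.ndarray] = {}

    def train(self, word, examples):
        feats = np.stack([extract_features(imu, emg) for imu, emg in examples])
        self.templates[word] = feats.mean(axis=0)

    def recognize(self, imu, emg) -> str:
        f = extract_features(imu, emg)
        return min(self.templates,
                   key=lambda w: np.linalg.norm(f - self.templates[w]))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rec = NearestTemplateRecognizer()
    # Synthetic stand-ins for two signs; real data would arrive over the radio link.
    for word, offset in [("hello", 0.0), ("thanks", 2.0)]:
        rec.train(word, [(rng.normal(offset, 0.1, (WINDOW, 3)),
                          rng.normal(offset, 0.1, (WINDOW, 1))) for _ in range(5)])
    test_imu = rng.normal(2.0, 0.1, (WINDOW, 3))
    test_emg = rng.normal(2.0, 0.1, (WINDOW, 1))
    print("Recognized sign:", rec.recognize(test_imu, test_emg))  # -> "thanks"
```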

It’s part of the huge and growing wearable technology trend. The global wearables market, which includes gear from hearing aids to smart glasses and fitness trackers, could top $50 billion in 2018, up from about $10 billion in 2013, according to research firm IHS.

Roozbeh Jafari, until recently an electrical engineering professor at the University of Texas at Dallas, came up with the idea. He recruited electrical engineering Ph.D. student Jian Wu and two computer engineering graduate students, Lu Sun and Zhongjun Tian, to build the system in his Embedded Systems and Signal Processing Lab. (Sun just graduated.)

Although the sign language system is still in the prototype stage, it can recognize about 40 American Sign Language words with a 96 percent accuracy rate, Jafari said.

Today, cameras are typically used to recognize and translate sign language into speech or text, said Wu, 28. The advantage of the wearable technology is that it’s low-cost, comfortable and doesn’t invade the user’s privacy, and it can be worn anywhere and at any time, he said.

Sun, Tian and Wu — all originally from China — won second place and $7,500 in the TI Innovation Challenge this summer. They also presented a paper on the system at the Body Sensor Networks Conference in Boston in June.

“I’m quite proud of what they’ve done,” Jafari said.

To read the rest of this article, published in the Dallas Morning News, please click here.