ASU student develops sensing technology for visually, hearing impaired

A visually impaired man can’t see the interviewer sitting across from him, but he knows the questioner has a smile on his face.

Motors in the visually impaired person’s chair give the sensation of having a shape, a smile, drawn on his back.

“We use that as a way to communicate the face,” said Shantanu Bala, an Arizona State University graduate.

Vision through chair massage.

Bala, who developed the new technology while at ASU, is helping people with visual and hearing disabilities perceive what they otherwise can’t sense.

Here’s how it works: the visually impaired person sits in a chair fitted with an array of small motors. A camera captures the interviewer’s facial movements, and that information is translated into patterns of vibration on the sitter’s back, allowing them to feel the movement of somebody’s face.

“It’s for the most part hidden,” Bala said. “You might be able to hear a small buzz of different motors. One of the goals we had was to make it as discreet as possible.”
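The article doesn’t go into implementation detail, but the pipeline it describes (a camera feed, expression recognition, and a grid of motors tracing a shape on the back) can be sketched in a few lines of Python. Everything below, from the grid dimensions to the stubbed-out classifier and motor driver, is an illustrative assumption rather than a detail of CUbiC’s actual system:

    from dataclasses import dataclass

    GRID_ROWS, GRID_COLS = 4, 6  # assumed layout of the back-mounted motor grid

    @dataclass
    class Frame:
        """Stand-in for a camera frame; a real system would carry image data."""
        expression: str  # here we pretend a vision model has already labeled it

    def classify_expression(frame: Frame) -> str:
        """Stub for facial-expression recognition on the camera feed."""
        return frame.expression

    def smile_pattern(rows: int, cols: int) -> list[list[float]]:
        """Trace an upward-curving arc of motor intensities, like a drawn smile."""
        grid = [[0.0] * cols for _ in range(rows)]
        for c in range(cols):
            t = c / (cols - 1)  # 0.0 at one side of the back, 1.0 at the other
            # Parabola: corners of the "mouth" near the top, center dipping lower.
            r = round((rows - 1) * (1 - 4 * (t - 0.5) ** 2))
            grid[r][c] = 1.0  # full vibration intensity along the arc
        return grid

    def drive_motors(grid: list[list[float]]) -> None:
        """Stub for the motor driver; printed here so the sketch runs anywhere."""
        for row in grid:
            print(" ".join("#" if v > 0 else "." for v in row))

    if classify_expression(Frame(expression="smile")) == "smile":
        drive_motors(smile_pattern(GRID_ROWS, GRID_COLS))

Run as-is, the sketch prints the motor grid with a smile-shaped arc lit; in hardware, each cell’s intensity would instead set the drive level of the corresponding motor in the chair.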

Bala, who double majored in psychology and computer science, has worked with ASU’s Center for Cognitive Ubiquitous Computing (CUbiC) for six years on projects that give people with hearing or visual disabilities situational context.

The first device he worked on was a glove that provided different sensations on a person’s hand depending on what the computer’s camera picked up from the speaker.

“If another person is smiling, you actually kind of feel a smile on the back of your hand,” Bala said, drawing a smile on his hand. “That’s a way to take emotions and turn them into something a person with a visual disability can interpret as they’re talking to someone.”

To read the rest of this article, published in ASU News, please click here.