Augmenting Social Cues for the Disabled

For the first time in 12 years, Bryan Duarte is sensing a smile. The 30-year-old software engineering student became blind at age 18 after a motorcycle accident. He has spent the years since learning to navigate the world in darkness and figuring out how to communicate in a culture that relies heavily on visual cues.

“I was very good at reading people’s body language, their nonverbal language,” Duarte says. “To not have access to that information now, it makes things pretty difficult in social situations… without those social cues, you don’t know if someone is looking at you, if you have their attention. You don’t know if you maybe offended them. You don’t know if they’re smiling, if they’re engaged.”

Now, for the first time since the accident, Duarte is getting those nonverbal cues. In Arizona State University’s Center for Cognitive Ubiquitous Computing (CUbiC), Duarte sits in what appears to be a typical black desk chair. A USB port built into the back lets users plug in a computer with a webcam. Sitting across from Duarte, with a webcam aimed at his face, is Troy McDaniel, the center’s associate director. As they converse, a facial recognition system reads whether McDaniel looks happy, surprised, or neutral and relays that information to Duarte through distinct vibration patterns emitted from the chair’s lining.

These patterns are designed to mimic the way our mouths convey glee, boredom, or despair. When McDaniel adopts a neutral expression (straight-lipped, neither smiling nor frowning), Duarte feels a strip of pancake motors in the chair vibrate in a straight line, starting from the right of his lower back and moving to the left. When McDaniel looks happy, Duarte feels a U-shaped vibration that mimics the way lips curve upward in a smile. This demonstration of ASU’s “Haptic Chair” is the first time in more than a decade that Duarte has been able to get a sense of how his conversation partner is reacting without directly asking. During the 45-minute demonstration, which I watch via Skype, Duarte excitedly tells me when he can feel McDaniel’s body language shift.
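The mapping itself is easy to picture in software. The sketch below is a hypothetical illustration, not CUbiC’s actual code: it assumes a small grid of pancake motors in the chair’s lining and an upstream classifier that has already labeled the partner’s expression, then simply traces each emotion’s shape across the grid. The grid size, timing, and all function names here are assumptions for illustration only.

```python
# Hypothetical sketch of the emotion-to-vibration mapping described above.
# The article does not disclose CUbiC's actual software stack; the motor
# layout, pulse timing, and expression labels are assumptions.
import time

# Suppose a 3x5 grid of pancake motors, indexed (row, column),
# with row 0 at the top and column 0 on the sitter's right.
NEUTRAL_PATH = [(1, c) for c in range(5)]              # straight line, right to left
HAPPY_PATH = [(0, 0), (1, 1), (2, 2), (1, 3), (0, 4)]  # corners high, middle low: a smile's "U"

def pulse_motor(row, col, duration_s=0.15):
    """Stand-in for the real motor driver: just log which motor fires."""
    print(f"buzz motor ({row}, {col}) for {duration_s}s")
    time.sleep(duration_s)

def play_pattern(path):
    """Fire each motor along the path in sequence, tracing the shape."""
    for row, col in path:
        pulse_motor(row, col)

def relay_expression(label):
    """Render a classified expression ('neutral', 'happy', ...) as vibration."""
    patterns = {"neutral": NEUTRAL_PATH, "happy": HAPPY_PATH}
    if label in patterns:
        play_pattern(patterns[label])

if __name__ == "__main__":
    # In the real system, the label would come from a facial recognition
    # model watching the webcam; here we simulate two readings.
    relay_expression("neutral")
    relay_expression("happy")
```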

“He’s smiling right now. I can tell you that,” Duarte says. “When he was talking, I was getting kind of a mix between smiles and surprised looks, which is good because he might not have been exactly smiling, he might not have been exactly surprised, but to know he had emotion going on throughout that conversation is in a lot of ways the important aspect… As a blind guy, it’s really cool to be able to sit across from somebody and know even when they’re quiet what they’re doing.”

Sensing Social Cues

The Haptic Chair is currently a proof of concept and a long way from the commercial market. For now, the chair’s facial recognition system fails when speakers wear glasses, and it cannot identify negative emotions like sadness and fear, which are generally harder to distinguish visually than happiness or surprise. McDaniel’s team is planning a series of user studies in which they hope to tune the system to better accommodate blind users.

The rest of this article is published on PBS.org.