Lex Arriola is your typical 15-year-old girl. She uses her smartphone a ton. She texts. She FaceTimes. Like most teens, she loves emoji and, of course, Taylor Swift.
But unlike most of her peers, Arriola was born blind. When she gets texts with emoji, Siri translates them, so messages are punctuated by “face screaming in fear” (😱) and “puffing with angry face” (😤). She has a Braille Sense, a small book-sized beige contraption with a tactile keyboard that she uses to read and write. Arriola, a petite, curly-haired brunette with a warm smile, commands her always-dark iPhone screen with a flurry of taps, swipes and voice commands to Siri.
Her web experience is mostly good, she says, though she steers clear of Facebook and Snapchat because they’re so picture-heavy. But what she really wants is more autonomy in the real world. She got a glimpse of what technology will make possible when Google’s self-driving car stopped by an event she recently attended. The search giant stationed one of its robocars in the parking lot so kids and parents could take a look at the future. Arriola didn’t get to ride in it, but she was excited about it nonetheless, and hopes to buy one someday.
“Think about it—it’s less asking people to drive you around,” she said. “It’s more independence.”
Recent advancements in artificial intelligence, along with the proliferation of sensors, mean a technological revolution is coming for people with vision loss. Universities and companies like IBM, Microsoft and Baidu are working on technologies ranging from smart glasses to better computer-vision software that could one day serve as digital eyes for the estimated 285 million visually impaired people worldwide.
Over the last 30 years, technologists have made huge strides in making the internet more accessible to the blind, with digitized Braille systems and text-to-speech software that reads the words on a webpage or app aloud. More recently, companies, including Facebook, have started translating images into read-aloud text. (Maybe that will actually nudge Arriola to start using the social network.)
But “real life, compared to the cyberworld… has been a great challenge,” said Chieko Asakawa, a researcher at IBM and a professor at Carnegie Mellon University’s Robotics Institute. “The first step is to find out where you are,” she told me. “If a computer knows where I am, a computer can help a lot.”
Technology currently being developed for machines is going to be a boon for the visually impaired. Now more than ever, tech companies need computerized machines to know exactly where they are, whether they’re working in a factory or cruising down city roads. The blind may benefit most from the tech world’s obsession with self-driving cars—not only by being able to use them to get around, but because the artificial intelligence developed for cars, to help them see and navigate streets, will likely be repurposed for assistive technologies.
To read the rest of this article, published in Fusion, please click here.