It’s a common misconception that most hearing-impaired people can easily read lips. While many are indeed practiced lip readers, only 30 to 40 percent of spoken English can be understood by watching the mouth. Much of speech happens without visible lip movement, and many sounds, such as ‘b’ and ‘p,’ look identical on the lips.
This leaves many hearing-impaired people at a loss when communicating with the hearing. A number of recent technological innovations attempt to address the issue, from devices that turn spoken language into text on a smartphone to speculative systems that let deaf people “hear” through their tongues. That's right—researchers at Colorado State University are developing an earpiece that translates sounds into electrical patterns, which it then sends to a retainer worn in the mouth.
Now, a company is hoping to help the hearing-impaired in a more seamless way. The Live-Time Closed Captioning System (LTCCS) instantly turns speech into scrolling text displayed on a tiny screen clipped to a pair of glasses. Though the device is still a proof of concept, its founders say it “restores the user's ability to engage in a naturally flowing conversation.”
LTCCS creator Daniil Frants was inspired to design the device when his guitar teacher asked him if he thought Google Glass might be able to somehow help him communicate with his hearing-impaired father.
“I started messing around with Google Glass, seeing if it could do some closed captioning function,” he says. “But after six months it became obvious that there was no way to do that effectively using Glass.”
So Frants decided to do it himself. He created a system built from existing or modified parts—a Raspberry Pi microcomputer, a voice recognition system and a display adapted from one he spotted on an online technology forum. The user wears a microphone, which picks up speech and feeds it to the microcomputer. The microcomputer translates the audio to text using the voice recognition software, then sends the text to the display mounted on a pair of glasses.
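As a rough illustration only—this is not the team's actual code, and all the component names here are hypothetical stubs—the pipeline described above (microphone audio in, recognized text out to a small scrolling display) can be sketched in Python like this:

```python
# Minimal sketch of a live-captioning pipeline of the kind described:
# audio chunks flow from a microphone to a speech recognizer, and the
# recognized words are pushed to a small wearable display that scrolls.
# Every class below is a stub standing in for real hardware or a real
# speech-to-text engine.

class Microphone:
    """Stub: yields audio chunks (real code would read a mic stream)."""
    def __init__(self, chunks):
        self.chunks = chunks

    def stream(self):
        yield from self.chunks


class Recognizer:
    """Stub speech-to-text engine: maps each audio chunk to a word."""
    def transcribe(self, chunk):
        # A real system would run acoustic and language models here.
        return chunk.get("word", "")


class Display:
    """Stub for the glasses-mounted screen: a scrolling text buffer."""
    def __init__(self, width=3):
        self.width = width      # number of words visible at once
        self.buffer = []

    def show(self, word):
        self.buffer.append(word)
        self.buffer = self.buffer[-self.width:]  # scroll old words off
        return " ".join(self.buffer)


def run_captioning(mic, recognizer, display):
    """Pipe audio through recognition and onto the display."""
    frames = []
    for chunk in mic.stream():
        word = recognizer.transcribe(chunk)
        if word:
            frames.append(display.show(word))
    return frames


# Simulated conversation: each chunk stands in for about a word of audio.
mic = Microphone([{"word": w} for w in "nice to meet you".split()])
frames = run_captioning(mic, Recognizer(), Display(width=3))
print(frames[-1])  # → "to meet you"
```

The design point the sketch captures is the scrolling buffer: the display holds only the last few recognized words, so the text keeps pace with a naturally flowing conversation instead of accumulating a transcript.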
By the way, Frants is 17. His VP of Frants Innovators, Inc., Ilan Pesselev, is 14. The rest of his team is 18 and under. Most of them attend the same Manhattan high school.
I asked Frants where he learned the skills to create the LTCCS, given that he has yet to attend college, let alone graduate school. He explains that his father taught him some basic programming skills and he taught himself the rest.
“If I needed to learn something new, I’d Google a bunch of stuff,” he says.
While "Googling stuff" might not help the average person figure out such a complex system, Frants is not average. At 14, he was the youngest person to ever intern at the ultra-prestigious MIT Media Lab, which focuses on human-machine research (think "smart" prosthetics and intelligent machines). He's also worked on cyber art projects that have been displayed all over the world.
Frants and his team hope to have a proper prototype by summer 2016. Ultimately, the device will retail for $750.
In the short term, Frants, who recently appeared on "The Tonight Show Starring Jimmy Fallon," hopes to study computer science at MIT. In the longer term, he’d like to see his company, Frants Innovators, become a hub for new ideas.
“Like a Darwinism for ideas, where eventually some die off and what’s left is the best idea,” he says. “I hope the LTCCS is the first idea for that.”