“Superpower Glass” Helps Kids With Autism Understand Emotions
A new Stanford-designed technology pairs Google Glass with a face-identifying AI app that tells wearers what emotions they’re seeing
People with autism often struggle with understanding what others are thinking or feeling. Decoding facial expressions can be especially tricky. Is that smile a genuine grin of delight, or a tight grimace of politeness? Does that wrinkled brow mean anger, or just concentration? When you can’t understand the messages on other people’s faces, it’s hard to engage socially. Children with autism are therefore often left out of the group interactions so critical to development.
Now, Stanford researchers say they have a possible new aid: Google Glass. They’ve combined the augmented reality glasses with an app that uses artificial intelligence to identify faces and facial expressions in a child’s field of vision, then shows the child an emoji of the correct expression.
“Children with autism unanimously struggle to engage their social world,” says Dennis Wall, a professor of pediatrics and biomedical data science at the Stanford University School of Medicine, who led the research. “They don’t make face contact, and they don’t really understand the emotional differences that are exhibited in the faces. These are the two primary deficits that a lot of the behavioral therapy focuses on today.”
Intensive behavioral therapy—working one-on-one with a therapist, teacher or caregiver, to increase skills, including social skills—is helpful for many kids with autism. The problem is, Wall explains, that there are too many kids with autism and too few trained therapy providers, leaving many children languishing on wait lists. The earlier the intervention, the more successful it’s likely to be. But many children can’t get into early intervention therapy, which would ideally start as young as early toddlerhood, because of wait lists, lack of insurance or a late age of diagnosis.
Wall hopes the new technology, which his team has nicknamed "Superpower Glass," could help bridge the gap between diagnosis and beginning treatment.
“We’ve worked very hard to build a mobilized system that can go to the home and generalize to the child’s natural environment,” Wall says.
The glasses work with an outward-facing camera, which snaps images and passes them to a phone app via Wi-Fi. The app uses machine learning to identify faces and classify their expressions. The glasses then show a green box that lights up on the periphery of the child’s vision to say “you found a face.” An instant later, an emoji with the correct facial expression pops up. The glasses also have a microphone that can give the child verbal information, but Wall and his team are finding that kids prefer the emojis.
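The feedback loop described above can be sketched in a few lines of Python. This is only an illustration of the idea, not the actual app: the `Frame`, `process_frame`, and score structures are hypothetical stand-ins, since the article does not publish the system's internals. A real implementation would run a trained face detector and emotion classifier on each camera frame; here, mock per-emotion scores play that role.

```python
from dataclasses import dataclass

# The eight core expressions the app recognizes, mapped to the emoji
# feedback shown in the wearer's peripheral vision.
EMOTION_EMOJI = {
    "happiness": "😄", "sadness": "😢", "anger": "😠", "disgust": "🤢",
    "surprise": "😲", "fear": "😨", "contempt": "😒", "neutral": "😐",
}

@dataclass
class Frame:
    """A camera frame. `faces` stands in for a real detector's output:
    each entry is a (bounding_box, emotion_scores) pair, where
    emotion_scores maps each of the eight labels to a confidence."""
    faces: list

def process_frame(frame: Frame) -> dict:
    """One pass of the loop: detect a face, classify its expression,
    and return what the glasses should display."""
    if not frame.faces:
        # No face in view: no green box, no emoji.
        return {"face_found": False, "emoji": None}
    # Take the most prominent face and pick the highest-scoring emotion.
    _box, scores = frame.faces[0]
    emotion = max(scores, key=scores.get)
    return {"face_found": True, "emoji": EMOTION_EMOJI[emotion]}

# Example: one face the mock classifier scores as mostly happy.
frame = Frame(faces=[((10, 10, 80, 80), {"happiness": 0.9, "neutral": 0.1})])
print(process_frame(frame))  # {'face_found': True, 'emoji': '😄'}
```

In the real system this classification runs continuously on the paired phone, with the "face found" green box appearing first and the emoji an instant later.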
The app has three modes. The “free play” mode has children simply wear the glasses while interacting in their normal environments, capturing faces and emotions wherever they appear. Then there are two game modes: “guess my emotion,” where a parent acts out an expression, and “capture the smile,” where the kids give another person clues about an emotion they’re thinking of until the other person successfully acts it out.
At present, the app identifies eight core facial expressions: happiness, sadness, anger, disgust, surprise, fear, contempt and neutral.
“Learning these fundamental emotions unlocks or really peels back a layer of the developmental onion, so to speak, enabling [kids] to gain the confidence necessary to grow on their own in more complex social scenarios,” Wall says. “If they miss these eight early on, it’s very hard for them to learn later, and it’s even harder for them to learn more subtle social nuances like ‘interested’ or ‘disinterested.’”
In the study, 14 children between 3 and 17 tested the glasses at home, using them for at least three 20-minute sessions a week for an average of 10 weeks each. Their parents completed surveys about the children’s social skills at the beginning and end of the study.
Twelve of the 14 families said their children made more eye contact by the end of the study. The children’s average scores on parent-completed questionnaires of their social skills decreased by 7.38 points during the study, indicating a reduction in autism symptoms. Six of the 14 participants’ scores declined enough to move them down a step in autism severity classification, for example from “severe” to “moderate” or from “mild” to “normal.” The gains remained in place weeks after the study ended, suggesting the glasses could potentially serve as a temporary “crutch.”
The findings were described earlier this month in the journal npj Digital Medicine.
Andrea Ruppar, a professor of rehabilitation psychology and special education at the University of Wisconsin-Madison, says she sees promise in Superpower Glass.
“It seems that the technology would allow the person to review a real-life example of an emotional expression from a person with whom they interact often,” she says. “They would have many examples, which is essential for transferring the skill to other real-life contexts.”
Ruppar says people with autism have long been using technology to learn and connect. When she was a classroom teacher for kids with autism 20 years ago, she had students who would watch VHS tapes over and over to memorize lines of dialogue and then figure out how to use them in real life.
“I hope that as we advance the learning technology for students with autism, we keep people with autism in the driver’s seat,” Ruppar says. “The best technological solutions will come from listening to people with autism—not only those who use speech, but also those who require technology to communicate.”
The Stanford study was not controlled; the team has since completed a randomized controlled trial and is writing up the findings, which Wall says are promising and similar to those of the pilot study. The team now hopes to find an industry partner to produce the glasses on a larger scale, and eventually to get the technology approved by the FDA as a medical device, which would allow insurance companies to cover it. It also hopes to expand the app to offer a wider range of feedback beyond the eight core facial expressions, making the glasses useful to more children, and even adults.
“If I had to choose the best place to position this, it’s definitely in these younger kids, particularly if they’re on these waiting lists,” Wall says. “They need therapy but can’t get it. This is a great bridge.”