The relationship between humans and robots is a tricky thing. If the latter looks too much like the former, but is still clearly a machine, people think it’s creepy, even repulsive—a feeling that’s become known as the “uncanny valley.”
Or, as is sometimes the case, the human, with “Star Wars” or “The Jetsons” as his or her reference points, is disappointed by all the things the robot can’t yet do. Then, there is the matter of job insecurity—the fear of one day being replaced by a tireless, unflappable, unfailingly consistent device.
Human-robot interactions can be even more complicated for one group in particular—older adults. Many are not that comfortable with new technology, even less so if they feel it’s invading their privacy or a constant reminder of their own slipping cognitive skills.
And yet, it’s widely believed that with the first surge of Baby Boomers hitting their 70s—with a huge wave to follow—technology in some form will play a growing role in enabling older adults to live in their homes longer.
But will it be robot companions? Talking digital assistants? Strategically placed sensors? Or maybe some combination of devices? And what unexpected impact might they have on how people age, and on whether they stay connected to family and friends?
“You have to walk this balance on where you are starting to impinge on somebody’s privacy versus tracking their safety and social engagement,” says David Lindeman, co-director of Health Care at the Center for Information Technology Research in the Interest of Society (CITRIS) at the University of California, Berkeley. “That’s the compelling challenge of the next decade. How do we maximize the use of this technology without having unintended consequences?”
The right moves
For the past month or so, a small group of older adults in San Francisco has been learning to engage with a talking device named ElliQ. It’s more desk lamp than archetypal robot—think of the hopping light at the beginning of Pixar movies. But while ElliQ is meant to sit on a table or nightstand, it’s all about movement, or more accurately, body language.
Like Siri or Amazon’s Alexa, ElliQ talks. But it also moves, leaning toward the person with whom it’s speaking. It lights up, too, as another means of engagement, and uses volume and sound effects to distinguish its messages.
“If ElliQ is shy, she will look down and talk softly, and her lights will be soft,” explains Dor Skuler, CEO and founder of Intuition Robotics, the Israeli company behind the device. “If she tries to get you to go for a walk, she will lean forward and take a more aggressive tone, and her lights will be bright.
“Most of the way we communicate as humans is non-verbal,” Skuler adds. “It’s our body language, our use of silence and tone, [and] the way we hold ourselves. But when it comes to working with a computer, we’ve adapted to the technology instead of the other way around. We felt that a machine having a physical presence, versus a digital presence, would go a long way in having what we call natural communication.”
Skuler described a typical interaction. The grandchildren of an ElliQ owner send her photos through a chatbot using Facebook Messenger. When ElliQ sees new pictures have come in, it tells the grandmother and asks if she wants to look at them. If she says yes, ElliQ brings them up on its separate screen component. As the woman looks at the photos, so does ElliQ, tilting its “head” toward the screen, and turning the moment into more of a shared experience. With the help of its image recognition software, it might add, “Aren’t those girls cute?”
“It’s not the same as your adult child coming over to you and showing you photos of your grandchildren on her phone,” says Skuler. “But it’s also very different from you just looking at the photos on a screen by yourself. You weren’t with another person, but you weren’t really alone, either. We call that an in-between stage.
“What we like about this,” he adds, “is that without the family sending the content, there is no content. ElliQ isn’t there to replace the family. I don’t think we want to live in a world where people have meaningful relationships with machines. What it can do, though, is make that content more accessible and allow you to share the experience.”
Not too cutesy
A lot of research went into how ElliQ looks and behaves, says Yves Béhar, founder of fuseproject, the San Francisco industrial design firm that worked with Intuition Robotics on the project. That included getting input from experts on aging. (“Our first hire was a gerontologist,” says Skuler.)
“One of the key premises behind ElliQ is that technology is complicated and perhaps too complex for aging people to use,” Béhar says. “But artificial intelligence (AI) can be used to engage with a person in a much simpler way. It can remind a person to take their meds, or connect with their family, or just tell them, ‘Hey, why not go outside. It’s nice out.’
“And we felt that ElliQ should be a table object, rather than a creepy robot that follows you around,” he adds. “By keeping it in one room, a person can interact with it like they would a familiar appliance in a familiar context.”
There was another important consideration, notes Béhar. It had to look appropriate. “We didn’t want it to look childish or cartoonish,” he says. “We didn’t feel that was right. We wanted it to be friendly, but not too cutesy in a way that diminished the intelligence of the user.”
It’s also critical that ElliQ keeps learning. As Skuler explains it, one of the first steps in establishing a relationship with this particular robot is to set some goals, such as how many times a week a person wants to go out for a walk or be reminded to see friends. Then, it’s up to ElliQ to determine the most effective way to do its job. In other words, it will learn that one person responds better to “It’s nice out, why don’t you go for a walk,” while another needs to be prodded more aggressively with “You’ve been on the couch watching TV for four hours. Time to get up and take a walk.”
“That’s where the emotive side kicks in,” he says. “ElliQ can set a whole different tone, and use different body language and gestures based on what works and what doesn’t work. The machine fine-tunes itself.”
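Intuition Robotics hasn’t published how ElliQ does this, but the “fine-tunes itself” idea Skuler describes resembles a simple bandit-style rule: track which prompt style actually gets a person moving, and favor it. The sketch below is purely illustrative, with all names and data invented.

```python
import random

# Two hypothetical prompt styles, echoing the examples in the article.
PROMPTS = {
    "gentle": "It's nice out, why don't you go for a walk?",
    "direct": "You've been on the couch for four hours. Time to take a walk.",
}

class PromptSelector:
    """Pick the prompt style with the best observed success rate,
    occasionally exploring other styles to keep learning."""

    def __init__(self, styles, explore=0.1, seed=None):
        self.stats = {s: {"tries": 0, "successes": 0} for s in styles}
        self.explore = explore
        self.rng = random.Random(seed)

    def _rate(self, style):
        t = self.stats[style]["tries"]
        # Optimistic prior so untried styles still get chosen early on.
        return self.stats[style]["successes"] / t if t else 0.5

    def pick(self):
        if self.rng.random() < self.explore:
            return self.rng.choice(list(self.stats))
        return max(self.stats, key=self._rate)

    def record(self, style, went_for_walk):
        self.stats[style]["tries"] += 1
        if went_for_walk:
            self.stats[style]["successes"] += 1

selector = PromptSelector(PROMPTS, explore=0.0, seed=0)
# Simulate a person who responds to direct prompts but ignores gentle ones.
for _ in range(5):
    selector.record("gentle", False)
    selector.record("direct", True)
print(selector.pick())  # "direct" wins once it has the better success rate
```

A real system would, of course, weigh far more signals than a yes/no outcome, but the core loop of try, observe, and re-weight is the same.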
While he describes ElliQ as a “good listener,” Béhar sees the device more as a coach than a companion. He acknowledges the risk of making machines too engaging, and thereby encouraging more social isolation, not less.
“We don’t want to create the kind of emotional dependency that social media sometimes does,” he says. “We need to make sure it complements their human relationships. It’s very important that we keep that in mind as we develop these interactions between humans and machines with artificial intelligence.”
An underlying question is how big a role machines will need to play in caring for older adults in the coming decades. AARP has predicted a “caregiving cliff,” estimating that by 2030, there will be only four family caregivers available for every person needing care, a ratio that will drop to three caregivers by mid-century.
Technology is certainly expected to help fill the void, but to what degree? Richard Adler, a research associate at the Institute for the Future in Palo Alto, has been studying the nexus of technology and aging for more than 25 years, and he agrees that the key is finding the proper role for machines.
“There’s always been this interesting paradox at the heart of it,” he says. “Of all the age groups, older adults stand to benefit the most from technology. In a lot of ways. But it also is the group that has the lowest level of adoption.”
He’s encouraged by the recent big leaps in voice recognition by machines because they allow older people to use technology without having to mess with smartphones or type on small keyboards. “But that’s a long way from saying this is a tool that combats social isolation,” he adds. “Because that would involve a lot more than a machine with a nice voice.”
And, for all the possible benefits of artificial intelligence in helping older adults stay on top of their health needs and social activities, Adler is wary of machines taking too much control. “There’s the discussion of AI versus IA—intelligence augmented—where machines extend human capabilities instead of replacing them.”
That tension between what technology can now do and how much older people actually use it is at the heart of what’s become known as “connected aging”—the use of machines, from smartphones to sensors to wearable devices, to enable adults to grow old in their own homes. David Lindeman, who is also director of the Center for Technology and Aging in California, has long studied how older adults interact with machines. While he points out that researchers are still in the early stages of understanding how technology can affect social isolation, he sees a lot of potential.
“I think it’s better to err on the side of let’s get people engaged and see what works,” he says. “There are such deficits in terms of social engagement for a lot of people.” He points to software that makes it easier for older adults to share stories from their past, and the use of virtual reality to help them feel less isolated.
Lindeman also says sensors and other devices are making it easier to track the movements of older adults and determine if they’ve fallen or may need help. “If you capture enough information, you’ll be able to tell if a fall is serious or not,” he notes, “and you’d be better able to help people keep from going to the emergency room unnecessarily.”
Learning from sensors
In fact, researchers can learn quite a bit about a person’s behavior without the benefit of a talking robot, according to Diane Cook, a researcher at Washington State University’s Center for Advanced Studies in Adaptive Systems, who has spent the past decade studying how older adults live. She’s done it without cameras or microphones—and the privacy concerns they raise. She and her team only use sensors.
Not only are the devices able to keep track of safety matters, such as whether the stove was left on or water was running, but they have also gathered a wealth of enlightening data on how people move around their homes.
“Motion sensors are the bread and butter because they can point to where a person is in the home,” she says. “Location alone doesn’t let you know what’s going on, but when you combine it with the time of day, what happened right before, and where they were, then you can start to see patterns that make a lot of sense.”
The research has been augmented by assessments of the subjects’ mental and physical health every six months, and it turned out that Cook and her team were able to predict “with promising results” how people would score on the tests, based on what the sensor data showed about their behavior. Changes in sleep patterns, for instance, were found to be correlated with changes in cognitive health.
“The greatest predictive performance, however, was achieved when we considered a very large number of features describing all detected activity patterns,” she says. “There is no one silver bullet behavior that indicates a change in health. The person needs to be looked at holistically.”
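Cook’s actual models aren’t described here, but the time-plus-location idea she outlines can be sketched roughly. The example below, with invented sensor events and arbitrary time buckets, counts motion-sensor firings per room and time of day, the kind of raw feature a pattern detector could watch for changes over weeks.

```python
from collections import Counter
from datetime import datetime

# Hypothetical motion-sensor events: (ISO timestamp, room where the sensor fired).
events = [
    ("2024-03-01T07:10", "kitchen"),
    ("2024-03-01T07:45", "kitchen"),
    ("2024-03-01T13:20", "living_room"),
    ("2024-03-01T23:05", "bedroom"),
    ("2024-03-02T03:30", "bathroom"),
    ("2024-03-02T07:15", "kitchen"),
]

def time_bucket(hour):
    """Map an hour of day to a coarse time-of-day label (arbitrary cutoffs)."""
    if 6 <= hour < 12:
        return "morning"
    if 12 <= hour < 18:
        return "afternoon"
    if 18 <= hour < 23:
        return "evening"
    return "night"

def activity_features(events):
    """Count sensor firings per (time-of-day, room) pair.

    Location alone says little; combining where a person is with when
    they are there is what turns raw events into interpretable patterns,
    e.g. a gradual rise in night-time bathroom visits across weeks.
    """
    counts = Counter()
    for stamp, room in events:
        hour = datetime.fromisoformat(stamp).hour
        counts[(time_bucket(hour), room)] += 1
    return counts

features = activity_features(events)
print(features[("morning", "kitchen")])  # 3 morning kitchen visits
print(features[("night", "bathroom")])   # 1 night-time bathroom visit
```

In practice such counts would be tracked over months and fed, alongside many other features, into the kind of holistic models Cook describes.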
The goal ultimately is to develop algorithms that will be able to predict what behavior changes mean, so steps can be taken to prevent health problems or emotional issues from worsening.
That, say longtime researchers like Lindeman, could be one of the greater benefits of technology when it comes to addressing the needs of an aging society. “We will be able to identify when people have differences in their gait, differences in their affect, differences in their interactions and the way they communicate, and that could help us pick up signs of depression and dementia much earlier.”
It all sounds very promising, and clearly timely. But as with any technology, the rapid advances in AI and robotics can outpace our understanding of their impact. Guy Hoffman certainly understands this. Hoffman is a leading expert on human-robot interaction. A TED talk he did a few years ago, in which he showed robots improvising music, has been viewed almost 3 million times.
Now a researcher and assistant professor at Cornell University, Hoffman served as an advisor on the ElliQ project. Lately, he’s been focused on seeing if robots can learn to be more responsive to humans. Recently, he was involved in a study led by researcher Gurit Birnbaum, in which a group of test subjects were asked to share with a robot a difficult situation they had experienced. Half of the people interacted with a device Hoffman describes as “more robotic”—it responded to the stories with a detached attitude, offering an occasional “Thank you. Continue.” In some cases, Hoffman notes, they even had the robot look at a cell phone while the person was talking.
But the other half of the research subjects talked to a robot designed to feel more empathetic. It leaned forward as the person talked, and when there was a pause in the story, might make a comment like, “That sounds like that was a really difficult thing to go through.”
The study produced some fascinating results. “We found that people felt those robots seem to care more about them,” he says. And, when the subjects were asked to follow up the storytelling with a stressful task—recording a video for a dating site—some asked if the “empathetic” robot could be there with them.
“They said that actually made them feel more confident about themselves,” Hoffman notes. “People said they felt more attractive after they had received more of the responsive behavior.”
Hoffman admits to having mixed feelings about it. “There’s a bright side and a dark side to all of this. Do we want people to talk to robots when they feel lonely? Are we solving a problem or are we making it worse? Those are the questions we need to ask.
“I always remind people that success should not be measured by technological success, but by societal success. There is a distinction that needs to be made between what is possible to do and what is desirable to do with robotics.
“That is really one of the most important conversations we need to have about technology today,” Hoffman says. “In one way, technology is increasingly successful in addressing our social needs. But we do not want the technology to drive our human values. We want our human values to drive our technology.”
This article was written with the support of a journalism fellowship from New America Media, the Gerontological Society of America and AARP.