One Day Your Phone Will Know If You’re Happy or Sad

By analyzing every tiny facial gesture and voice inflection, or even how quickly we tap out a text message, devices are getting good at reading our emotions

Facial analysis at work. Image courtesy of Affectiva

For all the time we spend with our cell phones, laptops and tablets, it's still pretty much a one-way relationship. We act, they respond. Sure, you can carry on a conversation with Siri on your iPhone, and while she's quick with an answer, it hardly qualifies as playful banter. You ask questions; she gives answers.

But what if these devices could really read our emotions? What if they could interpret every little gesture and facial cue, gauging our feelings as well as, or maybe better than, our best friends? And what if they could then respond, not with information, but with what might pass for empathy?

We’re not there yet, but we’re quickly moving in that direction, driven by a field of science known as affective computing. It’s built around software that can measure, interpret and react to human feelings. This might involve capturing your face on camera and then applying algorithms to every aspect of your expressions to try to make sense of each smirk and chin rub. Or it might involve reading your level of annoyance or pleasure by tracking how fast or with how much force you tap out a text or whether you use emoticons. And if you seem too agitated–or drunk–you could get a message suggesting that you might want to hold off pressing the send icon.

Given how difficult it is for us humans to make sense of other humans, programming machines to read our feelings is no small challenge. But the field is picking up speed as scientists sharpen their focus on teaching devices emotional intelligence.

Every move you make

One of the better examples of how affective computing can work is the approach of a company called, appropriately, Affectiva. It records expressions and then, using proprietary algorithms, scrutinizes facial cues against a database of almost 300 million frames capturing elements of human faces. The software has been refined to the point where it can associate various combinations of those elements with different emotions.
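Affectiva's algorithms and training data are proprietary, so the details aren't public, but the general shape of the approach can be sketched: reduce each video frame to a handful of numeric facial-cue measurements, then let a trained classifier map those measurements to emotions. The Python below is only a toy illustration of that idea; the feature names, training frames and labels are invented for the example, not drawn from Affdex.

```python
# Toy illustration of the general approach, NOT Affectiva's actual model.
# Feature names and data are made up; a real system would extract these
# measurements from video frames and train on millions of labeled examples.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [smile_intensity, brow_furrow, lip_press], scored 0.0-1.0
# (hypothetical facial-cue features for a single video frame).
training_frames = np.array([
    [0.9, 0.1, 0.0],   # broad smile, relaxed brow
    [0.8, 0.0, 0.1],
    [0.1, 0.8, 0.6],   # furrowed brow, pressed lips
    [0.0, 0.9, 0.7],
    [0.2, 0.2, 0.1],   # mostly neutral face
    [0.3, 0.1, 0.2],
])
labels = ["enjoyment", "enjoyment", "displeasure", "displeasure",
          "neutral", "neutral"]

# Fit a simple classifier that associates combinations of cues with emotions.
model = LogisticRegression(max_iter=1000).fit(training_frames, labels)

# Score a new frame of facial measurements.
new_frame = np.array([[0.7, 0.05, 0.1]])
for emotion, probability in zip(model.classes_, model.predict_proba(new_frame)[0]):
    print(f"{emotion}: {probability:.2f}")
```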

The software, known as Affdex, was developed at M.I.T.'s Media Lab by two scientists, Rosalind Picard and Rana el Kaliouby, and was originally designed to help autistic children communicate better. But it clearly had loads of potential in the business world, so M.I.T. spun the project off into a private company, which has since raised $21 million from investors.

So how is Affdex being used? Most often, it's watching people watching commercials. It records people as they view ads on their computers (don't worry, you need to opt in for this) and then, based on its database of facial cues, evaluates how the viewers feel about what they've seen. And the software doesn't just deliver an overall positive or negative verdict; it breaks down the viewers' reactions second by second, which lets advertisers identify, with more precision than ever before, what works in a commercial and what doesn't.

The software can also catch people saying one thing while their faces say another. In an interview with the Huffington Post, el Kaliouby gave the example of the response to an ad for body lotion that aired in India. During the commercial, a husband playfully touches his wife's exposed stomach. Afterwards, a number of women who had watched it said they found the scene offensive. But, according to el Kaliouby, the videos of the viewers showed that every one of those women responded to the scene with what she called an "enjoyment smile."

She sees opportunities beyond the world of advertising. Smart TVs could get that much smarter about what kinds of programs we like if they were able to build a memory bank of our facial expressions. Politicians could get real-time reactions to each line they utter during a debate and adapt their messages on the fly. Plus, says el Kaliouby, there could be health applications: she says it's possible to read a person's heart rate with a webcam by analyzing the blood flow in his or her face.

“Imagine having a camera on all the time monitoring your heart rate,” she told the Huffington Post, “so that it can tell you if something’s wrong, if you need to get more fit, or if you’re furrowing your brow all the time and need to relax.”
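The article doesn't say how a webcam can read a pulse, but the usual technique, often called remote photoplethysmography, watches the tiny color changes that each heartbeat produces in facial skin. Here is a minimal sketch of that idea, assuming the average green-channel brightness of the face region has already been extracted from every frame (the function and variable names are mine, and real systems need far more signal cleanup than this):

```python
# Minimal remote-photoplethysmography sketch: estimate heart rate from the
# frame-by-frame average green brightness of a face region. Illustrative only.
import numpy as np

def estimate_bpm(green_means: np.ndarray, fps: float) -> float:
    """Return the dominant pulse frequency, in beats per minute."""
    signal = green_means - green_means.mean()           # drop the constant offset
    spectrum = np.abs(np.fft.rfft(signal))              # frequency content
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)   # bin frequencies in Hz
    band = (freqs >= 0.7) & (freqs <= 4.0)              # roughly 42-240 bpm
    dominant_hz = freqs[band][np.argmax(spectrum[band])]
    return dominant_hz * 60.0

# Synthetic 10-second clip at 30 frames per second with a 1.2 Hz (72 bpm)
# pulse buried in noise, standing in for real webcam measurements.
fps, seconds = 30.0, 10
t = np.arange(int(fps * seconds)) / fps
fake_green = 120 + 0.5 * np.sin(2 * np.pi * 1.2 * t) + np.random.normal(0, 0.1, t.size)
print(f"Estimated heart rate: {estimate_bpm(fake_green, fps):.0f} bpm")
```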

So what do you think, creepy or cool?

Tracking devices

Here are five other ways machines are reacting to human emotions:

  • And how was my day?: Researchers at the University of Cambridge have developed an Android app that monitors a person's behavior throughout the day, using incoming calls, texts and social media posts to track his or her mood. The app, called "Emotion Sense," is designed to create a "journey of discovery," allowing users to keep a digital record of the peaks and valleys of their daily lives. The data can be stored and used in therapy sessions.
  • And this is me after the third cup of coffee: Then there’s Xpression, another mood-tracking app created by a British company called EI Technologies. Instead of relying on people in therapy to keep diaries of their mood shifts, the app listens for changes in a person’s voice to determine if they are in one of five emotional states: calm, happy, sad, angry or anxious/frightened. It then keeps a list of a person’s moods and when they change. And, if the person desires, this record can automatically be sent to a therapist at the end of every day.
  • What if you just hate typing on a phone?: Scientists at Samsung are working on software that will gauge your frame of mind from how you type out your tweets on your smartphone. By analyzing how fast you type, how much the phone shakes, how often you backspace over mistakes and how many emoticons you use, the phone should be able to determine whether you're angry, surprised, happy, sad, fearful or disgusted (a rough sketch of the idea follows this list). And based on the conclusion it draws, it could attach the appropriate emoticon to your tweet to tip off your followers to your state of mind.
  • Just don't invite your friends over to watch: Using a sensor worn on the wrist and a smartphone camera worn around the neck, researchers at M.I.T. have created a "lifelogging" system that collects images and data designed to show a person which events marked his or her emotional highs and lows. The system, called Inside-Out, includes a bio-sensor in a wristband that tracks heightened emotions through changes in the skin's electrical conductance, while the smartphone tracks the person's location and takes several photos a minute. At the end of the day, the user can review his or her experiences alongside all the sensor data.
  • Your brow says you have issues: This probably was inevitable. Researchers at the University of Southern California have created a robotic therapist that not only encourages patients with well-timed "uh-huhs," but also uses motion sensors and voice analysis to interpret a patient's every gesture and voice inflection during a therapy session.
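Here is the rough sketch promised in the Samsung item above. The company hasn't published how its software works, so the features, thresholds and labels below are invented for illustration; the point is only to show how typing behavior might be reduced to a few numbers and mapped to a mood guess.

```python
# Invented illustration, not Samsung's model: reduce typing behavior to a few
# numeric features and map them to a crude mood guess with hand-picked rules.
from dataclasses import dataclass

@dataclass
class TypingSample:
    chars_per_second: float   # typing speed
    shake_level: float        # accelerometer jitter while typing, scaled 0-1
    backspace_ratio: float    # backspaces divided by total keystrokes
    emoticon_count: int       # emoticons in the finished message

def guess_mood(sample: TypingSample) -> str:
    """Hypothetical rules; a real system would learn these from data."""
    if sample.shake_level > 0.6 and sample.backspace_ratio > 0.3:
        return "angry"
    if sample.chars_per_second > 5.0 and sample.emoticon_count >= 2:
        return "happy"
    if sample.chars_per_second < 1.0 and sample.emoticon_count == 0:
        return "sad"
    return "neutral"

print(guess_mood(TypingSample(6.2, 0.2, 0.05, 3)))   # prints "happy"
```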

Video bonus: Want to see how bizarre this trend of devices reading human emotions can get? Check out this promotion of Tailly, a mechanical tail that picks up your level of excitement by tracking your heart rate and then wags appropriately.
