Have Scientists Found a Way to Pop the Filter Bubble?

They say the key to exposing us to opposing views is to get them from people with whom we share other interests

Personalized search keeps people from escaping their worldview bubble. Image courtesy of Eli Pariser

We like to believe that every visit to Google is a search for knowledge, or at least useful information. It is, but it's also an act of narcissism.

Each time we retrieve search results, we pull out a virtual mirror that reflects who we are in the Web world. It's what Eli Pariser aptly described as the "filter bubble" in his 2011 book, The Filter Bubble: What the Internet Is Hiding From You.

Pariser laid out the thinking behind algorithmic personalization. By meticulously tracking our every click, Google--and now Facebook and a growing number of other websites--can make pretty good guesses, based on past behavior, about what we want to know. This means that two people running exactly the same search can end up with very different results.

We're fed what we seem to want, and since we're more likely to click on stuff within our comfort zone--including ads--Google, and others, are motivated to keep sharpening their targeting. As a result, the bubbles we live in are shrinking.

There's a price for all this precision, as Pariser pointed out in an interview with Brain Pickings' Maria Popova:

"Personalization is sort of privacy turned inside out: it’s not the problem of controlling what the world knows about you, it’s the problem of what you get to see of the world."

The bigger picture

So we're trapped in a maze of our own making, right?

Not necessarily, thanks to a team of scientists who say they may have come up with a way to escape the constraints of algorithms. As the MIT Technology Review reported recently, Eduardo Graells-Garrido at the Universitat Pompeu Fabra in Barcelona and Mounia Lalmas and Daniel Quercia at Yahoo Labs have developed what they call a "recommendation engine," designed to expose people to opposing views.

One key, say the researchers, is that those views come from people with whom we share other interests. That seems to make us more receptive to opinions we'd otherwise likely dismiss as folly. The other is to present opposing views in a visual way that makes them feel less foreign.

To that end, the scientists used the model of a word cloud, which allowed study participants both to see which subjects they tweeted about most often and to browse--in a visually engaging way--content from others whose own word clouds mentioned many of the same topics.

But what if some of that content reflected a very different political view? Would people instinctively reject it?

To put their theory to a proper test, the researchers connected people on opposite sides of an issue that evokes deeply personal feelings--abortion. They focused on thousands of active Twitter users in Chile who had included hashtags such as #prolife and #prochoice in their tweets, creating word clouds for them based on terms they used most frequently.

Then, they provided study participants with tweets from people who had many of the same terms in their word clouds, but who also held the opposite view on abortion. The researchers found that because people seemed to feel a connection to those who had similar word clouds, they were more interested in their comments. And that tended to expose them to a much wider range of opinions and ideas than they would have otherwise experienced.

In short, the researchers used what people had in common to make them more open to discussing ways in which they differed. They had, their paper concluded, found "an indirect way to connect dissimilar people."
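For the curious, the core matching idea--recommend content from people who disagree with you but tweet about the same things--can be sketched in a few lines of Python. This is only an illustration: the function names, the toy stance labels and the simple Jaccard overlap measure are assumptions made for the sketch, not the researchers' actual engine.

```python
from collections import Counter

def term_cloud(tweets):
    """Build a rough 'word cloud' as term frequencies from a user's tweets,
    ignoring hashtags (which mark stance rather than shared interests)."""
    terms = [w.lower() for t in tweets for w in t.split() if not w.startswith("#")]
    return Counter(terms)

def cloud_overlap(cloud_a, cloud_b):
    """Jaccard similarity over the terms two users tweet about."""
    a, b = set(cloud_a), set(cloud_b)
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(user, candidates, stance):
    """Rank users who hold the opposite stance on the issue by how much
    their term clouds overlap with the target user's cloud."""
    opposite = [c for c in candidates if stance[c["name"]] != stance[user["name"]]]
    return sorted(opposite,
                  key=lambda c: cloud_overlap(user["cloud"], c["cloud"]),
                  reverse=True)
```

The point of the design is the ranking step: a dissenting voice only surfaces if its word cloud already looks a lot like yours.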

So, there's hope yet.

Madness to the method

Here are other recent developments in the sometimes bizarre world of algorithms.

  • Nothing like automated "Warm personal regards": This was probably inevitable. Google has just received a patent for software that would keep such close track of your social media behavior that it would be able to offer you a choice of possible reactions to whatever comments or queries come your way on Facebook or Twitter. If, for instance, a friend gets a new job, the software would suggest a response, presumably something such as "Congratulations." That's right, you wouldn't have to waste any of your brain power. The algorithm would do it for you.

  • Phone it in: Researchers at the University of Helsinki have developed algorithms for determining how people get around--walking, driving or taking the bus or subway--by tracking their cell phones' accelerometer signals, which lets them analyze how often travelers stop and start. The researchers say it could be a powerful tool in helping planners understand how people move around in their cities.

  • All the news that fits: Facebook has tweaked its "news feed" algorithms so that more actual news will start showing up there. The idea is to give greater exposure to links to articles from news organizations on Facebook feeds--which will help make the social media giant more relevant to what's going on in the world besides friends' birthdays. The speculation is that this is an effort by Facebook to challenge Twitter's dominance in generating buzz around current events.

  • What does she have to say about the Chicago Cubs?: An Israeli computer scientist has created an algorithm that analyzes huge volumes of electronic data about past events--from sources ranging from the New York Times' archive to Twitter feeds--to predict what might happen in the future. Most notably, the scientist, Kira Radinsky, has used her system to predict the first cholera epidemic in Cuba in many decades and the protests leading up to the Arab Spring.
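The Helsinki idea in the second item above--guessing a travel mode from accelerometer jitter and stop frequency--can be caricatured as a toy heuristic. The thresholds and features here are invented for illustration; the team's actual algorithms are far more sophisticated.

```python
import statistics

def classify_mode(magnitudes, stops_per_km):
    """Toy heuristic, not the Helsinki team's actual algorithm:
    a jerky, step-driven acceleration signal suggests walking;
    among smoother traces, frequent stops suggest a bus,
    few stops a car."""
    var = statistics.pvariance(magnitudes)
    if var > 2.0:  # high variance: the bounce of footsteps
        return "walking"
    return "bus" if stops_per_km > 2 else "car"
```

Feeding it a bouncy trace like [0, 3, 0, 3, 0, 3] yields "walking", while a smooth trace is split between "bus" and "car" by how often the phone comes to rest.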

Video bonus: Here's the TED talk that made Eli Pariser and his concept of the filter bubble famous.

Video bonus bonus: There are algorithms for everything these days and, if you believe Sheldon of "The Big Bang Theory," that includes making friends.
