How Fake News Breaks Your Brain

Short attention spans and a deluge of rapid-fire articles on social media form a recipe for fake news epidemics

So much potentially misleading information, so little time. Ivan Chiosea / Alamy

"Pope Francis shocks world, endorses Donald Trump for president." “Clinton's assistant J. W. McGill is found dead.” “‘Tens of thousands’ of fraudulent Clinton votes found in Ohio warehouse.” These shocking news headlines of the past year all had one thing in common: They weren’t true. Not in the slightest. Each was manufactured, either out of malice or an attempt to cash in on advertising revenue, in an effort to deceive as many unwitting Internet readers as possible. They were, in other words, “fake news.”

Fake news, of course, is nothing new. In the past it took the form of pamphlets created to smear political enemies or sensationalist stories designed to “go viral” the old-fashioned way, through newspaper sales. But the recent surge of false information enabled by social media has pushed it to the forefront as a serious problem worthy of national and even international debate.

The problem, people say, is the medium. Which makes sense: Social media platforms like Facebook face criticism for enabling the spread of this kind of misleading or incorrect information because they allow any user, or even an automated bot, to post legitimate-looking articles that then spread like wildfire through “liking” and “sharing.” Now Facebook has rolled out new tools to crack down on fake viral articles, while Twitter is testing a new feature to let users flag misleading, false or harmful information.

But a new study published this week in the journal Nature Human Behaviour shows that the limitations of the human brain are also to blame. When people are overloaded with new information, they tend to rely on less-than-ideal coping mechanisms to distinguish good from bad, and end up privileging popularity over quality, the study suggests. It’s this lethal combination of data saturation and short, stretched attention spans that can enable fake news to spread so effectively.

"Through networks such as Twitter and Facebook, users are exposed daily to a large number of transmissible pieces of information that compete to attain success," says Diego Fregolente Mendes de Oliveira, a physicist at Northwestern University who studies how networks of people work and lead author of the study.

Because of the significant impact that social media can have on politics and life, Oliveira says, discriminating between good and bad information has become "more important in today's online information networks than ever before." Yet even though the stakes are higher, the dynamics of like-minded groups such as those found on social media can undermine the collective judgment of those groups—making judgment calls about fake news even harder. As the study puts it, when given too much information, humans become “vulnerable to manipulation.”

In 2016, Oliveira set out to study how information spreads on social networks, and particularly how "low-quality information" or fake news can end up rippling out like a contagion. He designed a theoretical model to predict that spread.

The model did not incorporate actual human users or actual fake articles. But it did draw on data collected by independent observers about debunked (but nonetheless popular) Facebook and Twitter articles to calculate an average ratio of real news to fake news in posts flagged for review by users. Oliveira used this ratio to parameterize an algorithm he designed that simulates how news is shared through a network.
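The paper's actual code isn't reproduced in this article, but the moving parts it describes (agents with limited attention, a stream of incoming posts, and resharing that favors quality) can be sketched. Below is a minimal, hypothetical Python simulation in that spirit; every name and parameter here, such as FAKE_RATIO, ATTENTION and post_rate, is an illustrative assumption rather than a value taken from the study.

```python
import random

# Illustrative parameters: assumptions for this sketch, not values from the study.
FAKE_RATIO = 0.25    # fraction of newly introduced posts that are fake
ATTENTION = 10       # how many posts an agent's feed can hold at once
N_AGENTS = 1000      # number of simulated users
FANOUT = 5           # how many random "followers" see each share

class Post:
    """A piece of news with a hidden quality score in [0, 1]."""
    def __init__(self):
        self.is_fake = random.random() < FAKE_RATIO
        # Fake posts get low quality, genuine posts high quality.
        self.quality = (random.uniform(0.0, 0.3) if self.is_fake
                        else random.uniform(0.7, 1.0))
        self.shares = 0

def run(post_rate, n_steps=50_000):
    """Simulate sharing. post_rate is the chance per step that an agent
    introduces fresh news instead of resharing from its feed, a crude
    proxy for the rate of information flow. Returns every post created."""
    feeds = [[] for _ in range(N_AGENTS)]
    posts = []
    for _ in range(n_steps):
        agent = random.randrange(N_AGENTS)
        if random.random() < post_rate or not feeds[agent]:
            post = Post()              # fresh news enters the network
            posts.append(post)
        else:
            # Reshare from the feed, weighting the choice by quality,
            # i.e. the discrimination that overload is said to degrade.
            post = random.choices(feeds[agent],
                                  weights=[p.quality for p in feeds[agent]])[0]
            post.shares += 1
        # Broadcast to random followers; the oldest item falls off a full
        # feed, which is how limited attention enters the sketch.
        for follower in random.sample(range(N_AGENTS), FANOUT):
            feeds[follower].append(post)
            if len(feeds[follower]) > ATTENTION:
                feeds[follower].pop(0)
    return posts
```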

This model was similar in design to a previous study in which Oliveira showed how people who segregate themselves into separate networks—the social bubbles of like-minded people one tends to create on Facebook, for example—can contribute to the spread of hoaxes and fake information. As the thinking goes, these people are less likely to be exposed to information that runs contrary to the posts their like-minded friends are sharing and that could debunk fake news and reveal the truth.

At relatively low flows of information, his algorithm predicted that a theoretical social media user was able to discriminate between genuine and fake news well, sharing mostly genuine news. However, as Oliveira and his coauthors tweaked the algorithm to reflect greater and greater flows of information—the equivalent of scrolling through an endless Twitter or Facebook feed—the theoretical user proved less and less capable of sorting quality information from bad information.

Oliveira found that, in general, popularity had a stronger effect than quality on whether a person shared something. At higher levels of information flow that effect became more pronounced, meaning people would theoretically spend less or no time assessing the information’s quality before deciding to share it. Soon, as they paid less and less attention to each piece of information, users were sharing fake news at higher and higher rates.

At the highest rates modeled, the quality of a piece of information had zero effect on the popularity of that information. "We show that both information overload and limited attention contribute to a degradation in the system's discriminative power," Oliveira said via email.
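One way to make that "discriminative power" concrete is to ask how well a post's quality predicts its popularity: if the rank correlation between the two falls toward zero, quality no longer affects what spreads. Continuing the hypothetical sketch above, the measure below is a crude Kendall-tau-style concordance count of my own construction, not the paper's actual metric.

```python
def discriminative_power(posts):
    """Fraction of post pairs whose share counts rank in the same order
    as their qualities. Near 1.0 means quality drives popularity;
    near 0.5 means quality tells you nothing about what spreads."""
    sample = random.sample(posts, min(len(posts), 500))
    concordant = total = 0
    for i, a in enumerate(sample):
        for b in sample[i + 1:]:
            if a.quality == b.quality or a.shares == b.shares:
                continue  # skip ties
            total += 1
            if (a.quality > b.quality) == (a.shares > b.shares):
                concordant += 1
    return concordant / total if total else 0.0

# Sweep the information flow: higher post rates flood feeds with fresh
# items, leaving less room for quality to shape what gets reshared.
for rate in (0.01, 0.05, 0.2, 0.5, 0.9):
    posts = run(post_rate=rate)
    print(f"flow={rate:.2f}  discriminative power={discriminative_power(posts):.2f}")
```

Whether this toy setup reproduces the study's quantitative results is untested; it is only meant to show where a flow-dependent loss of discrimination could come from.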

While the model has clear limitations, it does provide one interpretation of how fake news spreads. "Traditionally it is believed that truth has some inherent power to overcome false," says Haluk Bingol, a computer engineer at Boğaziçi University in Turkey who has long studied online networks. "Similarly, the good eventually beats the bad. Social norms are based on these assumptions. Interestingly this has never been tested empirically."

Bingol, who was not involved in this study, says it highlights how the quality of information does not always win out when it comes to distribution. Oliveira's research aligns with Bingol's previous findings on the relationship between choice and the amount of information. In one paper, he found that the recommendation of a merchant advertising a certain item to a potential customer mattered even more strongly when the customer was presented with more options to choose from.

"That is, if you artificially increase the number of choices, you can obtain better results with the same 'marketing push,'" Bingol says. In other words, a person being overloaded with information is much more easy to manipulate—for advertisers, and for purveyors of fake news. "Clearly this is not difficult to do today," he adds.

Walter Quattrociocchi, a computer scientist at the IMT School for Advanced Studies Lucca in Italy, is more skeptical of Oliveira's model. "Oversimplifying the complex social dynamics behind the emergence of narratives could be misleading," says Quattrociocchi, who was not involved in this research. For instance, the model worked on the simplifying assumptions that all social media users introduce new information at the same rate and start out with the same attention spans.

While he found the study interesting, Quattrociocchi notes that other research has shown how confirmation bias and other factors beyond the scope of Oliveira's model can significantly affect the spread of information online.

For future research, Oliveira hopes to enhance his model with some of these other factors, including how a person's relationship to the sharer of information affects how they process it, and how likely people would be to change their minds upon receiving information online that conflicts with their current beliefs.

At the end of the day, Oliveira believes that stopping fake news starts with readers. He suggests that people read carefully what they share online, resist unfriending or unfollowing other people, since doing so creates an online echo chamber, and avoid assuming something is trustworthy just because they trust the person sharing it. "Keep in mind that our friends are probably not good editors and are driven by emotions and biases more than objectivity and trustworthiness," he points out.

So give this article another read, and check out where it came from before you click “share.” 
