Study: An Abundance of Media Fuels Polarization

Recent decades have seen an explosion of sources of news and information, as well as increased political polarization. Is there a relationship between the two trends? Yale SOM’s Vahideh Manshadi and her co-authors built a model showing that faced with a flood of information, an individual tends to take in material that reinforces their existing beliefs.

An illustration of hundreds of TV screens

Every day there are billions of posts on Facebook, 500 million tweets on Twitter, and 200 million new Instagram pictures. There are also 500 hours of video uploaded to YouTube…every minute.

Add in cable television and websites of every size and political stripe, and that’s a lot of content reaching a lot of people. Only a few years ago, the media landscape was dominated by a handful of established outlets that adhered to a standardized set of journalistic practices. Now, the rise of social media and the splintering of mass media has seeded a proliferation of new information sources, some ostensibly neutral, others not. Is this explosion of voices just the sound of a vibrant society? Or is it drowning out the fact-based debate that democracy depends on?

“Is giving people more sources better?” asks Vahideh Manshadi, an associate professor of operations at Yale SOM. “Does that help society to learn the truth more effectively, to basically get to the same opinion, which is hopefully close to the truth?”

This question motivates a recent study from Manshadi, Gad Allon of the University of Pennsylvania, and Kimon Drakopoulos of the University of Southern California, for which the three built a model to explore the relationship between abundant news sources and the beliefs that people form.

The model follows a simple series of steps. Individuals are presented with several “posts” about an issue. Each post takes a different perspective and originates from a different source. Some sources have a known degree of credibility while others don’t. Each “person” then chooses to read one of the posts and incorporate the information that it provides into their own opinions.

Importantly, when a group of sources has unknown credibility—as is often the case on social media—individuals will filter what they read based on prior beliefs. This “confirmation bias,” in which people look for or interpret information in ways that support existing beliefs, has a long history in the psychology literature. In this case, it leads to an outcome that is troubling, if unsurprising to anyone who follows the news: polarization. Faced with a flood of information, an individual tends to take in material that reinforces their existing beliefs.
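To make that selection dynamic concrete, here is a minimal simulation sketch written loosely in the spirit of the model described above; it is not the authors’ actual formulation, and the function name spread_after_reading and every parameter value are illustrative assumptions. Agents with uniformly distributed beliefs are shown, each round, one post from a known-credible source located at the “truth” plus a number of posts from unknown-credibility sources; each agent reads whichever post sits closest to its current belief and nudges its opinion toward it.

```python
import random

def spread_after_reading(num_unknown_posts, num_agents=500, rounds=80,
                         step=0.2, truth=0.5, seed=0):
    """Average distance from the truth after repeated, belief-filtered reading."""
    rng = random.Random(seed)
    # Best-case starting point: beliefs spread uniformly over the spectrum [0, 1].
    beliefs = [rng.random() for _ in range(num_agents)]
    for _ in range(rounds):
        # Each round everyone sees one post from a known-credible source
        # (placed at the truth) plus posts from unknown-credibility sources
        # taking scattered perspectives.
        posts = [truth] + [rng.random() for _ in range(num_unknown_posts)]
        for i, b in enumerate(beliefs):
            # Confirmation-bias filter: read the post closest to what you
            # already believe, then fold it into your opinion.
            chosen = min(posts, key=lambda p: abs(p - b))
            beliefs[i] = b + step * (chosen - b)
    return sum(abs(b - truth) for b in beliefs) / num_agents

# With only a few competing posts, the credible one often wins and opinions
# pull toward the truth; with a flood of posts, the filter nearly always
# finds a closer, more congenial post and opinions stay dispersed.
print("few unknown sources: ", round(spread_after_reading(2), 3))
print("many unknown sources:", round(spread_after_reading(50), 3))
```

Under these assumed parameters, the first run ends with beliefs clustered near the truth, while the second stays scattered across the spectrum, which is the flavor of the finding: when sources are abundant and their credibility unclear, belief-based filtering alone can keep society from converging.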

“The fact that there are so many sources means we cannot consume them all,” Manshadi says. “And so we become selective in our own ways, which results in polarization.”

Manshadi notes that the model that she and her colleagues used represents a best-case scenario, in which people approach a set of posts with a uniformly distributed set of beliefs, and with no prejudice other than the screening bias. In real life, people are often vehemently split across ideological divides while, at the same time, covert efforts are underway on social media to amplify existing polarization.

“We look at a very simple model, and we try to avoid factors that we know would make confirmation bias worse,” Manshadi says. “But even without those factors, even without including initial biases that people might have before reading any pieces of news, this polarization exists. Now if you include other biases, it’s just going to get worse.” She and her colleagues are currently conducting a lab experiment to explore how the effect plays out in a more realistic setting.

Manshadi says a partial fix would be relatively straightforward. First, people need to actively suppress their biases and expose themselves to diverse news sources; in other words, the researchers urge, consumers should read beyond the bubble of their social networks. On the other side of the equation, companies like Facebook and Google should do a better job of actively sorting legitimate information from untruths and conspiracies. She and her colleagues write: “We believe that online news platforms should invest in technology and third-party fact-checking to evaluate and label posts when presenting them to users.”

Several years ago, Facebook did just that, announcing that it would begin flagging untrustworthy posts. But users questioned whether Facebook’s method for flagging was biased. Was the company, for instance, favoring more liberal sources? The company decided instead that simply presenting a diversity of information would help people rationally weigh an issue and, in that way, converge on a generally accepted truth. Manshadi’s work seems to offer the opposite conclusion.

Even as companies like Facebook, Twitter, and Google have tinkered with methods for controlling the most extreme content, they have long and vehemently insisted that they are tech platforms, not publishers or media companies, and therefore don’t bear responsibility—or, under Section 230 of the Communications Decency Act of 1996, legal liability—for the information, and misinformation, they disseminate. Until their approach changes, it will fall on consumers to think carefully about the role of social media in U.S. democracy.

“In one sense, these platforms are playing a positive role by facilitating our access to more news sources,” Manshadi says. “But as this trend is emerging, we also see more and more polarization and misinformation in society.”
