Fake News

How Facebook’s Infowars Crisis Exposes Mark Zuckerberg’s Hypocrisy

Facebook says it can’t get rid of a conspiracy-theory site, but will penalize it by showing its posts to a smaller audience. That doesn’t seem to align with its philosophy of community.
Photo: Ilana Panich-Linsman/The New York Times/Redux.

On Wednesday evening, Facebook convened a small group of media and tech reporters at its New York headquarters with its head of News Feed. This product—the core stream of posts shared by pages, publishers, and people’s friends—has been heavily scrutinized in the wake of the 2016 election, during which Facebook, like a handful of other tech platforms, was compromised and manipulated by foreign actors. During its meet-and-greet with reporters, billed as “a presentation about our work to prevent the spread of false news,” Facebook screened a nearly 12-minute movie it released in May called “Facing Facts,” a glossily produced sizzle reel directed by Academy Award-winning documentary filmmaker Morgan Neville. In one scene, a Facebook employee draws four quadrants on the board along an x-axis labeled “TRUTH” and a y-axis labeled “INTENT TO MISLEAD,” an ostensible attempt to grapple with the nuances of news, truth, right, and wrong on the Internet. The employee then gestures to the upper-left quadrant, describing news people share on Facebook that’s low in truth but high in its desire to mislead—in other words, “things that were explicitly designed and architected to be viral,” he explains. “These are the hoaxes of the world. These are things like Pizzagate. This is just false news. We have to get this right if we’re going to regain people’s trust.”

In theory, this is a step in the right direction. Attempting to weed out posts about things like Pizzagate, a conspiracy theory promoted by the likes of Infowars’ Alex Jones, would seem to align with Facebook’s stated goal of minimizing the impact of stories intended to mislead. But the company’s commitment to this philosophy was thrown into question immediately after the presentation, when Sara Su, a Facebook News Feed product specialist, and John Hegeman, the head of Facebook’s News Feed, took a question from CNN reporter Oliver Darcy. Darcy asked how Facebook could allow Infowars, which has a Facebook page with more than 900,000 followers, to continue to operate on its platform. In response, Hegeman said that Facebook doesn’t “take down false news . . . I guess just for being false that doesn’t violate the community standards,” adding that Infowars has “not violated something that would result in them being taken down.”

While Infowars’ conspiracy theories “can be really problematic” and “it bugs me too,” said Su, the organization’s page represents a gray area: Facebook is focused on taking down content that “can be proven beyond a doubt to be demonstrably false,” a criterion that, to Facebook’s mind, rules out Infowars, whose posts have claimed that the Sandy Hook shooting was a hoax, that 9/11 was an inside job, and, more recently, that Democrats were going to start a civil war on the Fourth of July. “I think part of the fundamental thing here is that we created Facebook to be a place where different people can have a voice,” Hegeman said. “And different publishers have very different points of view.” In a follow-up statement to Darcy, Facebook again justified the logic behind its Infowars verdict: “We work hard to find the right balance between encouraging free expression and promoting a safe and authentic community, and we believe that down-ranking inauthentic content strikes that balance,” Facebook spokeswoman Lauren Svensson said. “In other words, we allow people to post it as a form of expression, but we’re not going to show it at the top of News Feed.”

Interestingly, this line of reasoning appears totally divorced from one of Facebook’s overarching philosophies. Back in January, C.E.O. Mark Zuckerberg pledged to begin the year with a fresh approach, announcing that he was “changing the goal I give our product teams from focusing on helping you find relevant content to helping you have more meaningful social interactions.” Doing so, he said, would allow the company to hew closer to its original goal: “We built Facebook to help people stay connected and bring us closer together with the people that matter to us,” he wrote. “That’s why we’ve always put friends and family at the core of the experience.” This announcement coincided with a change in Facebook’s News Feed to emphasize posts from personal connections. By leaning on its message of community-building—whether that be through a one-on-one Messenger conversation, a closed Facebook group where people share recipes, or a status update that reaches hundreds or thousands of others—Facebook is arguing that connections on the platform are valuable. It seems counterintuitive, then, that the social-media giant would make the penalty for spreading misinformation a smaller audience, as opposed to no audience at all—doing so implies that a more intimate community is a punitive measure, which contradicts Zuckerberg’s January overtures.

As Thursday pressed on, Facebook was forced to contend with a steady stream of posts questioning its methodology. “We just don’t think banning Pages for sharing conspiracy theories or false news is the right way to go. They seem to have YouTube and Twitter accounts too—we imagine for the same reason,” Facebook’s corporate Twitter account wrote to The New York Times’ Kevin Roose, name-checking other social-media platforms, perhaps out of a sense of growing frustration at being singled out. NBC News reporter Ben Collins questioned why Facebook was equating Infowars with “pages on both the left and the right pumping out what they consider opinion or analysis—but others call fake news.” In other words: why bring a company peddling conspiracy theories into a conversation about news bias? “Sorry you feel that way,” Facebook responded, again via its corporate account. “The question we face is whether to ban a Page for peddling information debunked by fact-checkers or to demote it so fewer people see it. We’ve chosen the second.”

In choosing to allow sites like Infowars to maintain pages on free-speech grounds, Facebook has once again resisted taking a stand against users who abut—but do not cross—the line between permissible content and policy violations. Yet, at least based on the explanations the company has offered this week, Facebook’s milquetoast method of dealing with accounts in these so-called gray areas seems to conflict with its stated goals, both of building community and of weeding out misinformation. In glibly allowing bad actors like Infowars to continue existing on its platform, Facebook is enabling the persistent spread of low-truth stories intended to mislead. And until that permissiveness affects its stock price or its ability to turn a profit, Facebook has little incentive to change. “A reckoning? What reckoning?” an investor told me this week. “Facebook escaped any lasting impact of a privacy scandal—this was made clear during its last earnings report.”