Facebook and YouTube Give Alex Jones a Wrist Slap

On Friday, Facebook barred Alex Jones, who runs the Infowars website, from posting for 30 days because of repeated policy violations. YouTube has placed a first strike against his account; two more within 90 days would mean termination. Credit: Ilana Panich-Linsman for The New York Times

The digital walls are closing in on Alex Jones, the social media shock jock whose penchant for right-wing conspiracy theories and viral misinformation set off a heated debate about the limits of free speech on internet platforms.

Facebook said on Friday that it had suspended Mr. Jones from posting on the site for 30 days because he had repeatedly violated its policies. The social network also took down four videos posted by Mr. Jones and Infowars, the website he oversees.

“We received reports related to four different videos on the pages that Infowars and Alex Jones maintain on Facebook,” a Facebook spokeswoman said in an emailed statement. “We reviewed the content against our Community Standards and determined that it violates. All four videos have been removed from Facebook.”

The 30-day ban applies only to Mr. Jones personally, not to Infowars or to any of the other administrators of his Facebook page, which has nearly 1.7 million followers. Those people will still be able to post to Mr. Jones’s page as long as their posts don’t violate the site’s policies — meaning that Mr. Jones could still appear in videos and stories posted to the page as long as he does not post them personally.

In fact, Mr. Jones appeared on a live-streamed Facebook video on his page on Friday, shortly after the suspension went into effect, in which he claimed that he was the victim of a media conspiracy to “de-platform” conservative voices.

“This is war,” Mr. Jones said in the video.

Facebook may remove Mr. Jones’s page altogether if he continues to violate its policies, the spokeswoman said. This week, Facebook determined that one of Mr. Jones’s recent videos — an inflammatory rant in which he accused Robert S. Mueller III, the special counsel, of supporting pedophilia and pantomimed shooting him — did not violate its policies.

On Tuesday, YouTube took down four videos uploaded to Mr. Jones’s channel, which has 2.4 million subscribers, for violating its policies on hate speech and child endangerment. The violation placed a first strike against Mr. Jones’s account on YouTube, preventing the channel from streaming live video for 90 days. If Mr. Jones receives two more strikes during that period, YouTube will terminate his account.

“We apply our policies consistently according to the content in the videos, regardless of the speaker or the channel,” YouTube said in a statement.

Mr. Jones did not immediately respond to a request for comment.

This is not the first time that Mr. Jones’s videos have received a strike from YouTube. In February, YouTube levied a strike for a video claiming that David Hogg, one of the outspoken student survivors of the school shooting in Parkland, Fla., was a “crisis actor.” YouTube said the video had violated its policies around harassment and bullying. But since there were no additional violations during the next 90 days, the strike was removed from the account.

Facebook and YouTube acted after weeks of controversy over Mr. Jones, who first gained notoriety by insisting that the terrorist attacks of Sept. 11, 2001, were an “inside job” by the United States government. Since then, he has questioned whether the 2012 massacre at Sandy Hook Elementary School was a hoax, promoted the so-called Pizzagate conspiracy theory and said fluoridated water was part of a government mind-control plot.

Despite promoting these unsupported claims, Mr. Jones has been allowed to build a wide audience on social media platforms. Conservatives, who have accused Facebook, YouTube and other platforms of censoring right-wing views, have rallied behind him before.

This month, at a press event in New York about Facebook’s efforts to combat misinformation and false news, a reporter from CNN questioned company executives about why Infowars was still allowed to have a Facebook account. At the time, the company appeared unwilling to say Mr. Jones’s content violated its policies.

“Look, as abhorrent as some of this content can be, I do think that it gets down to this principle of giving people a voice,” Mark Zuckerberg, Facebook’s chief executive, said in a Recode podcast interview.

As an example, Mr. Zuckerberg cited Holocaust denial as a message that he found personally offensive but was wary of removing from Facebook, in order to protect users’ free-speech rights.

Within Facebook, the free-speech issues raised by Infowars and Mr. Jones have become an especially contentious topic. Employees have used internal chat forums to question executives about the site’s policies, according to one Facebook employee, who asked to remain anonymous because of fear of retribution. One group of Facebook workers, which included people of Jewish and Eastern European descent, raised Mr. Zuckerberg’s position on Holocaust denial with their superiors, saying they found it incomprehensible, according to the employee.

At a House Judiciary Committee hearing this month, Democratic lawmakers pressed Monika Bickert, Facebook’s global head of policy management, about why Infowars was allowed to remain on Facebook. Several days later, two parents of a child killed in the Sandy Hook shooting wrote an open letter to Mr. Zuckerberg, criticizing him for allowing Mr. Jones and his followers to use Facebook to harass and intimidate the families of victims.

“Our families are in danger as a direct result of the hundreds of thousands of people who see and believe the lies and hate speech, which you have decided should be protected,” the parents wrote.

Facebook’s policies about misinformation have been vague and inconsistently applied, and the company has appeared flat-footed when dealing with popular purveyors of conspiracy theories and hyperpartisan content such as Mr. Jones and Infowars.

In briefings with reporters this month, Facebook executives struggled to define the company’s policies regarding accounts that repeatedly post false or misleading news. The executives said that if third-party fact-checkers found roughly one-third of an account’s posts false, the account would be demoted, or “down-ranked,” in order to limit its visibility. The company has refused to reveal a list of accounts that have been down-ranked. Later, the company said it would remove, rather than down-rank, misinformation that could lead to physical violence.

Daisuke Wakabayashi and Sheera Frenkel contributed reporting.
