YouTube did not provide a clear definition of what it considers to be harmful misinformation. Photograph: Dado Ruvic/Reuters

YouTube vows to recommend fewer conspiracy theory videos


Site’s move comes amid continuing pressure over its role as a platform for misinformation and extremism

YouTube will recommend fewer videos that “could misinform users in harmful ways”, the company announced on Friday, in a shift for a platform that has faced criticism for amplifying conspiracy theories and extremism.

The change concerns YouTube’s recommendations feature, which automatically creates a playlist of videos for users to watch next. The recommendations are the result of complex and opaque algorithms designed to capture a user’s interest, but they have become a locus of criticism when YouTube directs people to potentially harmful and false content that they would not have otherwise sought out.

The company did not provide a clear definition of what it considers to be harmful misinformation, but said that some examples were “videos promoting a phony miracle cure for a serious illness, claiming the Earth is flat, or making blatantly false claims about historic events like 9/11”.

The changes will also affect “borderline content”, or videos that come close to violating the company’s rules for content without technically crossing the line.

A YouTube spokesperson did not provide additional details on what it would consider “borderline content” and said the company did not have statistics on the extent to which users discover videos through the recommendation feature. The shift would apply to less than 1% of videos, the company said.

“We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users,” the company said in a blogpost, noting that the shift would only impact recommendations and not which videos exist on the platform.

YouTube, Facebook and other social media platforms have faced growing scrutiny in recent years for their role in hosting and amplifying political propaganda and abusive content that has real-world consequences and can lead to violence.

In 2016, the conspiracy theory that became known as “Pizzagate” – a popular rightwing fake news story alleging that the Comet Ping Pong restaurant was linked to a child sex ring involving the Hillary Clinton campaign – motivated a gunman to fire a weapon inside the restaurant.

Last week, YouTube also announced a ban on videos depicting “dangerous challenges and pranks” following a viral challenge that involved driving blindfolded.

The recommendations feature has played a significant role in pushing damaging videos and beliefs, said Andrew Mendrala, supervising attorney of Georgetown Law’s Civil Rights Clinic. Last year, the clinic filed a lawsuit against the rightwing commentator Alex Jones and his site Infowars, alleging that Jones spread defamatory conspiracy theories that led to the abuse and in-person harassment of Brennan Gilmore, a counter-protester at the violent white supremacist rally in Charlottesville.

“It’s an echo chamber. It’s a feedback loop,” Mendrala said of YouTube’s algorithms. “It creates an insular community that is continually fed misinformation that reinforces their prejudices.”

YouTube’s announcement may amount to only an incremental change in corporate policy, but given the platform’s wide reach, even small changes can have a meaningful effect, Mendrala said.

Mendrala noted that YouTube was the main platform that gave Jones an audience for his attacks on Gilmore: “People’s world views were poisoned to such an extent that they threatened Brennan’s life.” Last year, under significant public pressure, YouTube, Facebook and other tech companies banned Jones’s pages.

Lenny Pozner, the father of a victim of the Sandy Hook elementary school massacre who became the subject of rampant conspiracy theories and abuse, said on Friday that Google, which owns YouTube, was the first tech company to work with him on his concerns.

He estimated that YouTube had since taken action on roughly 90% of “hoaxer content” on the platform: “Google was the first one to come around.”

Experts caution that platform changes meant to reduce the audience for conspiracy theories can often fuel the false narratives themselves, with believers and others claiming that censorship is further proof of the conspiracy theory.

“It’s going to attract the ire of conspiracy theorists, certainly,” said Joseph Uscinski, a University of Miami political science professor and conspiracy theory expert. The new shift raised questions about YouTube’s standards and policies, he added: “If it’s ‘borderline’, does it cross the line? And who sets the line? … Who gets to decide what misinformation is?”
