Former YouTube content moderator describes horrors of the job in new lawsuit

Key Points
  • A contractor who worked as a moderator for YouTube through 2019 alleges the company and its third-party vendors failed to support employees tasked with viewing violent content.
  • The plaintiff, referred to as Jane Doe, said she had to view mass murders, abortions and bestiality among other violent videos.
  • Her lawsuit alleges the companies didn't adhere to their own moderation standards and didn't provide necessary information prior to her starting the job.
A symbolic photo on the topic of upload filters: a YouTube icon for blocked content.
Thomas Trutschel | Photothek | Getty Images

A former YouTube moderator is suing YouTube, accusing it of failing to protect workers who have to catch and remove violent videos posted to the site. 

The suit, filed Monday in California Superior Court in San Mateo, says the plaintiff was required to watch murders, abortions, child rape, animal mutilation and suicides. As part of moderator training, the company allegedly presented a video of a "smashed open skull with people eating from it," a woman who was kidnapped and beheaded by a cartel, and a person's head being run over by a tank.

YouTube parent company Google faces increasing pressure to control content spanning violence and misinformation — particularly as it approaches the 2020 U.S. election and antitrust investigations from state attorneys general, the Department of Justice and Congress.

The plaintiff, who is referred to as Jane Doe, worked as a YouTube content moderator for staffing contracting firm Collabera from 2018 to 2019, her lawsuit said. She claims she experienced nightmares, panic attacks and an inability to be in crowded areas as a result of the violent content she viewed while working for the company.

YouTube's "Wellness Coaches" weren't available for people who worked evening shifts and were not licensed to provide professional medical guidance, the suit says. It also alleges moderators had to pay for their own medical treatment when they sought professional help.

Neither YouTube nor Collabera responded to requests for comment.

The suit says many content moderators remain in their positions for less than a year and that the company is "chronically understaffed," so moderators end up working overtime and exceeding the company's recommended four-hour daily viewing limit. Despite the demands of the job, moderators had little margin for error, the suit said.

The company expects each moderator to review 100 to 300 pieces of video content each day with an "error rate" of 2% to 5%, the suit said. The companies also control and monitor how the videos are displayed to moderators: whether they appear full-screen or as thumbnails, whether they are blurred, and how quickly moderators watch them in sequence.

The suit comes as moderators for social media companies speak out on the toll the job takes on their mental health. YouTube has thousands of content moderators, and most work for third-party vendors including Collabera, Vaco and Accenture. The San Francisco-based Joseph Saveri Law Firm, which is representing the plaintiff, filed a similar lawsuit against Facebook that resulted in a $52 million settlement in May.

The suit suggests YouTube may need to provide more resources for the people who remove videos that violate its rules. YouTube has reportedly reverted to relying on humans to find and delete content after using computers to automatically sift through videos during the coronavirus pandemic; it switched back to human moderators because the computers were removing too many videos that didn't violate the rules.