Former YouTube content moderator sues the company after developing symptoms of PTSD

The unnamed moderator says YouTube failed in its duty of care

Illustration by Alex Castro / The Verge

A former YouTube content moderator is suing the Google-owned company for failing to properly protect her and her co-workers from the mental harms caused by reviewing hours and hours of graphic footage every day.

The proposed class-action lawsuit against YouTube is being brought by the Joseph Saveri Law Firm, which previously sued Facebook for failing to safeguard the mental health of its own content moderators. That earlier suit resulted in Facebook paying a $52 million settlement to moderators who developed PTSD as a result of their work for the company.

The lawsuit, which was first reported by CNET, says YouTube consistently failed to follow its own safety guidelines and provided inadequate support to moderators. As a result of her time working for the company, the lawsuit’s plaintiff, who remains anonymous, says she suffered “severe psychological trauma” and developed symptoms of PTSD and depression.

The lawsuit says the videos the plaintiff had to watch and review during her employment included footage of cannibalism, child rape, suicide, self-harm, and bestiality, as well as videos of a woman being beheaded by a cartel, a person’s head being run over by a tank, a fox being skinned alive, and school shootings.

“She has trouble sleeping and when she does sleep, she has horrific nightmares,” says the lawsuit of the plaintiff. “She often lays awake at night trying to go to sleep, replaying videos that she has seen in her mind. She cannot be in crowded places, including concerts and events, because she fears mass shootings. She has severe and debilitating panic attacks.”

The plaintiff in this recent lawsuit reviewed YouTube content through a third-party agency called Collabera, where she worked from January 2018 to August 2019. Tech companies like Google and Facebook frequently use such agencies; the work is typically low-paid, with short-term contracts and minimal health benefits. Employees must also sign non-disclosure agreements (NDAs) that prevent them from speaking publicly about their work.

The lawsuit details a number of alleged failings from YouTube and Collabera, including the following:

  • During the training process, new employees were exposed to graphic content without proper guidance or preparation. Trainees were told they could leave the room while being shown this content, but the lawsuit says people were concerned that “leaving the room might mean losing their job.”
  • During training “little to no time was spent on wellness and resiliency.” Counselors guiding trainees told them to get enough sleep and exercise and take regular breaks during work, but when content moderators started full-time employment the pace of the job meant “these promised breaks were illusory.”
  • YouTube’s best practices say moderators should not view graphic content for more than four hours a day, but because workplaces were “chronically understaffed,” this limit was “routinely” exceeded.
  • Support services for content moderators included access to “Wellness Coaches,” but these coaches were not medically trained professionals who could diagnose or treat mental health disorders. One coach counseled the lawsuit’s plaintiff to take illegal drugs to cope with her symptoms, while another told a co-worker to simply “trust in God.”
  • Content moderators feared that any complaints made to coaches would be reported to management, so they were unable to speak freely about their problems on the job.

The lawsuit also highlights the lengths YouTube has gone to “to shield itself from liability.” It notes that in December of last year, four days after The Verge published an investigation into the trauma caused by this work, the company began requiring content moderators to sign a statement acknowledging that the job can give them PTSD.

YouTube has repeatedly said it would use AI systems to relieve the burden on human moderators, but just this week the company admitted that such automated filters are less accurate than human reviewers. YouTube and other tech platforms are facing increasing scrutiny over their moderation duties, not only because of the trauma inflicted on employees but also because of the spread of racist content and misinformation.

The lawsuit is currently filed on behalf of the individual plaintiff but is proposed as a class-action suit “on behalf of all persons who performed content moderation work for YouTube in the United States at any time up until the present.”