Facebook’s 7,500 Moderators Protect You From the Internet’s Most Horrifying Content. But Who’s Protecting Them?

Allegations that Facebook failed to keep its content moderators safe from the physical and psychological trauma of viewing graphic images shine a light on a dark side of the social-media industry.

BY Christine Lagorio-Chafkin - 26 Sep 2018

PHOTO CREDIT: Getty Images

A new lawsuit is shedding light on an unpleasant and often overlooked reality of many high-profile Silicon Valley businesses.

On June 21, one of Facebook's content moderators--the people trusted to keep gore, pornography, and violence off the site--filed a class-action lawsuit against the company, claiming she developed post-traumatic stress and psychological trauma from viewing graphic images and videos. Selena Scola's complaint alleges Facebook failed "to provide a safe workplace for the thousands of contractors who are entrusted to provide the safest environment possible for Facebook users."

The action can hardly come as a surprise. Social-media companies rely on legions of content moderators, most of them outsourced contractors, to keep their sites palatable to a broad audience and to advertisers. Now ubiquitous, the job of "community manager" or "content moderator" is still the dirty secret that props up so much of the tech industry.

"Every day, Facebook users post millions of videos, images and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder," the filing reads. This is precisely the content the company's legion of at least 7,500 content moderators are responsible for cleaning up.

Facebook has said it has workplace safety standards to protect content moderators. The California Superior Court is tasked with examining whether the company has actually maintained those standards--or whether it subjects employees to what the complaint calls "dangerous conditions that cause debilitating physical and psychological harm."

Devastating effects on moderators

Scola's allegations aren't as uncommon or extreme as they may sound. Post-traumatic stress symptoms are familiar to content moderators and community managers--the friendly-sounding job title that often also includes content policing--across multiple social sites. Snapchat has fought an uphill battle against pornography; Instagram has grappled with revenge pornography, sextortion, terrorism, and more. At YouTube, which historically has claimed little knowledge of what users upload to its site, content moderators manually review videos that appear next to certain advertiser content.

I had a unique window into the lives of content moderators when reporting on Reddit for my forthcoming book, We Are the Nerds; I spoke with many current and former community managers. Marta Gossage, who saw firsthand the development of community moderation as the social web became home to an increasing trove of culture--including some of the darkest corners of humanity--said Scola's grievances in the class action ring true.

"Almost everyone I know who has encountered those things regularly, has experienced some form of PTSD," says Gossage, who worked at the social-blogging platform LiveJournal and from 2011 to 2014 as a community manager at Reddit. "These things don't always emerge right away--they can come up later. The anxiety, the sleeplessness--it's insidious. It's a job that certainly changes your life."

Facebook, by its own tally, in just three months of 2018 "took action" on 21 million posts containing nudity and sexual activity, placing warnings on that content rather than removing it. An additional 3.4 million posts featuring graphic violence were removed--and that's not to mention likely millions involving hate speech, terrorism, and noxious spam.

In April, a former Facebook employee described to the BBC the horrors she'd seen moderating content for the site. "It's the most important job in Facebook, and it's the worst, and no one cares about it," she said.

Facebook posted in July that all content reviewers have access to mental health resources and receive full health care benefits. Facebook's director of corporate communications, Bertie Thompson, told Inc. in a statement:

We recognize that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources. Facebook employees receive these in house and we also require companies that we partner with for content review to provide resources and psychological support, including onsite counseling--available at the location where the plaintiff worked--and other wellness resources like relaxation areas at many of our larger facilities.

At the same time, as recently as March, Facebook CEO Mark Zuckerberg expressed his reluctance to make broad content-policy decisions--or, essentially, draw lines that would make the site cleaner and as a result make content moderators' jobs easier. "I feel fundamentally uncomfortable sitting here in California in an office making content policy decisions for people around the world," Zuckerberg told Recode.

"Where's the line on hate speech? I mean, who chose me to be the person that did that?" Zuckerberg continued. "I guess I have to, because we're here now, but I'd rather not." Now he also has to grapple with how to provide adequate coping mechanisms and protections for thousands of employees who would "rather not" risk being inflicted with traumatic stress.
