Facebook moderator: I had to be prepared to see anything

Image caption: Sarah Katz worked at Facebook for eight months

This article contains descriptions of child sexual abuse and other acts readers may find disturbing.

"It's mostly pornography," says Sarah Katz, recalling her eight-month stint working as a Facebook moderator.

"The agency was very upfront about what type of content we would be seeing, in terms of how graphic it was, so we weren't left in the dark."

In 2016, Sarah was one of hundreds of human moderators working for a third-party agency in California.

Her job was to review complaints of inappropriate content, as flagged by Facebook's users.

She shared her experience with BBC Radio 5 live's Emma Barnett.

"They capped us on spending about one minute per post to decide whether it was spam and whether to remove the content," she said.

"Sometimes we would also remove the associated account.

"Management liked us not to work any more than eight hours per day, and we would review an average of about 8,000 posts per day, so roughly about 1,000 posts per hour.

"You pretty much learn on the job, specifically on day one. If I had to describe the job in one word, it would be 'strenuous'.


Illegal images

"You definitely have to be prepared to see anything after just one click. You can be hit with things really fast without a warning.

"The piece of content that sticks with me was a piece of child pornography.

"Two children - the boy was maybe about 12 and the girl about eight or nine - standing facing each other.

"They weren't wearing pants and they were touching each other. It really seemed like an adult was probably off camera telling them what to do. It was very disturbing, mostly because you could tell that it was real.

Reappearing posts

"A lot of these explicit posts circulate. We would often see them pop up from about six different users in one day, so that made it pretty challenging to find the original source.

"At the time there was nothing in the way of counselling services. There might be today, I'm not sure."

Sarah says she would probably have taken up counselling if it had been offered.

"They definitely warn you, but warning you and actually seeing it are different.


"Some folks think that they can handle it and it turns out they can't, or it's actually worse than they expected."

Graphic violence

"You become rather desensitised to it over time. I wouldn't say it gets any easier but you definitely do get used to it.

"There was obviously a lot of generic pornography between consenting adults, which wasn't as disturbing.

"There was some bestiality. There was one with a horse which kept on circulating.

"There's a lot of graphic violence, there was one when a woman had her head blown off.

"Half of her body was on the ground and the torso upwards was still on the chair.

"The policy was more stringent on removing pornography than it was for graphic violence."

Fake news

"I think Facebook was caught out by fake news. In the run-up to the US election, it seemed highly off the radar, at least at the time I was working there.

"I really cannot recall ever hearing the term 'fake news'.

"We saw a lot of news articles that were circulating and reported by users, but I don't ever recall management asking us to browse news articles to make sure that all the facts were accurate.

"It's very monotonous, and you really get used to what's spam and what's not. It just becomes a lot of clicking.

"Would I recommend it? If you could do anything else, I would say no."

Facebook responds

The BBC shared Sarah's story with Facebook.

In response, a Facebook spokesman said: "Our reviewers play a crucial role in making Facebook a safe and open environment.

"This can be very challenging work, and we want to make sure they feel properly supported.

"That is why we offer regular training, counselling, and psychological support to all our employees and to everyone who works for us through our partners.

"Although we use artificial intelligence where we can, there are now over 7,000 people who review content on Facebook, and looking after their wellbeing is a real priority for us."