A picture taken on Feb. 5, 2019, shows the Facebook logo displayed on a tablet in Paris. LIONEL BONAVENTURE/Getty Images

Content moderation has come under a spotlight over the last few years, and few companies have drawn more scrutiny than Facebook, whether over the 2016 election or the Alex Jones controversy. Thanks to a new report, we now have insight into the toll that moderating content for the social media giant can take.

A new report from The Verge has pulled back the curtain on the life of moderators hired to screen Facebook posts for violent videos, sexual comments, and conspiracy theories. The report focuses specifically on Cognizant, a contractor whose Phoenix site employs over 1,000 people to go through thousands of Facebook posts every day. According to those employees, the material, combined with a strict work environment, takes its toll.

Several employees, while not identified by name, have spoken about the psychological trauma the job has allegedly inflicted on them. One employee, after reviewing a video of a stabbing, said, “I have a genuine fear over knives. I like cooking — getting back into the kitchen and being around the knives is really hard for me.” Another even started carrying a gun out of fear of unstable former co-workers.

The company does offer on-site counseling, but many employees said it wasn't enough and that they had to seek outside professional help on their own. To cope, some employees have reportedly turned to offensive jokes, drug use, or in-office sex. That, combined with relatively low pay, contributes to a reportedly high burnout rate, according to The Verge.

Following the release of the article, Facebook published a company-wide email responding to the accusations. "We know there are a lot of questions, misunderstandings and accusations around Facebook’s content review practices — including how we as a company care for and compensate the people behind this important work," Facebook began its statement.

"We are committed to working with our partners to demand a high level of support for their employees; that’s our responsibility and we take it seriously..."

And this isn't the first time the issue has come up. In September 2018, a former employee sued Facebook for failing to provide content moderators with adequate means of coping with the stress of their jobs. That lawsuit, in combination with this new report, further highlights the toxicity that seems to have taken over social media.