Traumatized and underpaid: what Facebook's leaked files reveal
Recently on Spark, we've talked about what responsibility Facebook should take when users post violent and distressing content on the site.
Facebook has been vocal about its plans to combat this problem, including announcing that it will be hiring 3,000 content moderators in addition to the 4,500 people around the world already doing the job.
There are guidelines for those moderators, which were recently leaked to The Guardian newspaper.
"There's two billion people on Facebook and they have a central assumption about themselves which is that what Facebook does is merely reflect back what people do," Alexis says.
"There's almost no way in which this is an unstructured or just 'pure mirror' environment of regular human social life. It, in fact, has reshaped the way people interact with each other."
Alexis argues that until Facebook stops believing it's merely a reflection of society and accepts that it's an influencer, it won't be able to actively reckon with the problems on its hands.
"These contractors are paid very little and on a daily basis are looking at child sexual abuse, beheadings and other very violent acts perpetrated by terrorists," Olivia says.
According to moderators she spoke to anonymously, they get only two weeks of training -- during which they have to absorb a huge amount of information on how to deal with "a whole range of terrible behaviour of humanity."
"The job is pretty grim, the training and psychological support were not sufficient, and people were leaving after experiencing burnout and requiring psychological support for symptoms that resembled PTSD."
Olivia says Facebook needs to value its contracted workforce more, although she concedes she's not sure whether it is Facebook or the contracting companies that should be providing that help.
"But either way, Facebook needs to ensure that those contracted companies are delivering even more emotional support to those individuals."