Leaked files reveal Facebook's internal rules on violence, hate speech, porn, and more
- Monday, May 22nd, 2017
Facebook has come under intense pressure in recent months over the way it handles offensive, extremist and violent content on its site. Yet until now, little was known about how the company actually moderates that content.
A Guardian investigation has uncovered leaked documents, including more than 100 internal training manuals, spreadsheets and flowcharts, that outline how the social network teaches its staff to review content. The documents provide blueprints of the guidance Facebook uses to moderate issues of violence, hate speech, terrorism, pornography, racism and self-harm, as well as match-fixing and cannibalism.
Some of the guidelines in the files make for worrying reading and will raise further questions about the internet giant’s ethics. They include:
- Remarks such as ‘someone shoot Trump’ must be removed, but it’s fine to say ‘to snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat’, ‘little girl needs to keep to herself before daddy breaks her face’, or ‘should off [sic] let those Zionists burn’.
- Videos of violent deaths can stay, unless there is evidence of sadism or celebration, because while disturbing they are considered educational. The same goes for non-sexual physical abuse and bullying. Videos of abortions are also allowed, provided there is no nudity.
- Animal abuse can be shared, on the grounds that it raises awareness, though some footage may be marked as ‘disturbing’.
- People can livestream self-harm because Facebook ‘doesn’t want to censor or punish people in distress’ – though the video content will be removed ‘once there’s no longer an opportunity to help the person’.
These are just a few examples of the questionable guidelines that Facebook’s moderators must follow – guidelines which reportedly leave many of the staff facing difficulty and stress.
Regarding the types of speech allowed on its platform, Facebook says in the documents: “We aim to allow as much speech as possible but draw the line at content that could credibly cause real world harm.
“People commonly express disdain or disagreement by threatening or calling for violence in generally facetious and unserious ways.
“We aim to disrupt potential real world harm caused from people inciting or coordinating harm to other people or property by requiring certain details to be present in order to consider the threat credible.”
The leak of Facebook’s internal moderation guidelines comes as the company faces growing pressure over violent acts on its platform, including livestreamed murders in Cleveland, Ohio, and Phuket, Thailand, as well as hate speech and revenge porn.
To help deal with this massive influx of unsuitable content, Facebook announced it would add 3,000 moderators to the 4,500 it already has. But can 7,500 people really keep up with around 2bn users and approximately 1.3m posts shared every minute – a workload of more than 170 posts per moderator per minute, before accounting for shifts or breaks? Probably not, especially not under policies like these.