Facebook opens its Community Standards for all to see how it decides what to censor

Facebook has gone public on how it decides what is and is not allowed on its social network, publishing in full the Community Standards guidelines it issues to the employees responsible for policing content for hate speech, child abuse, terrorism and the like. The company is also giving users the right to appeal its decisions on individual posts, so they can ask for a second opinion when they think Facebook has made a mistake.

In a statement posted on Facebook’s website, Monika Bickert, the company’s VP of global product management, said Facebook had decided to publish the internal guidelines for two reasons. “First, the guidelines will help people understand where we draw the line on nuanced issues,” she wrote. “Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines – and the decisions we make – over time.”

In the statement, Bickert references the experts policing Facebook’s content, and the wide range of specialisms they cover. In addition, she writes, the team seeks input from experts and organizations outside Facebook so that they can better understand different perspectives on safety and expression, as well as the impact of Facebook’s policies on different communities globally.

“Based on this feedback, as well as changes in social norms and language, our standards evolve over time,” she writes. “What has not changed – and will not change – are the underlying principles of safety, voice and equity on which these standards are based. To start conversations and make connections people need to know they are safe. Facebook should also be a place where people can express their opinions freely, even if some people might find those opinions objectionable. This can be challenging given the global nature of our service, which is why equity is such an important principle: we aim to apply these standards consistently and fairly to all communities and cultures. We outline these principles explicitly in the preamble to the standards, and we bring them to life by sharing the rationale behind each individual policy.”

She goes on to acknowledge that Facebook’s enforcement isn’t perfect, and outlines the measures being taken to improve it, such as deploying AI, and increasing the number of content reviewers by 40 per cent over the past year, to 7,500.

The right to appeal will begin with posts that were removed for nudity/sexual activity, hate speech or graphic violence. Anyone who has had a post removed for these reasons will be notified, and given the option to request additional review. This will lead to a review by a (human) member of Facebook’s team, typically within 24 hours. If the decision is overturned, the post will be restored and the user notified.
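To make the sequence of steps concrete, here is a minimal sketch of the appeal flow as described above. It is purely illustrative: the class names, fields and functions are assumptions for this example, not Facebook's actual systems.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical names for illustration only; Facebook's internal tooling is not public.
class RemovalReason(Enum):
    NUDITY_SEXUAL_ACTIVITY = "nudity/sexual activity"
    HATE_SPEECH = "hate speech"
    GRAPHIC_VIOLENCE = "graphic violence"

# The initial rollout of appeals covers these three violation types.
APPEALABLE_REASONS = set(RemovalReason)

@dataclass
class RemovedPost:
    post_id: str
    reason: RemovalReason
    appealed: bool = False
    restored: bool = False

def notify_user(post: RemovedPost, message: str) -> None:
    print(f"[notification to author of {post.post_id}] {message}")

def request_appeal(post: RemovedPost) -> None:
    # The user asks for a second opinion on a removed post.
    if post.reason in APPEALABLE_REASONS:
        post.appealed = True
        notify_user(post, "Appeal received; a reviewer will look at it, typically within 24 hours.")

def human_review(post: RemovedPost, reviewer_upholds_removal: bool) -> None:
    # A human reviewer re-checks the post; if the removal is overturned, the post is restored.
    if post.appealed and not reviewer_upholds_removal:
        post.restored = True
        notify_user(post, "Your post has been restored.")

# Example run through the flow described in the article.
post = RemovedPost("p123", RemovalReason.HATE_SPEECH)
notify_user(post, f"Your post was removed for {post.reason.value}. You may request additional review.")
request_appeal(post)
human_review(post, reviewer_upholds_removal=False)
```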

In time, Facebook plans to widen the appeal process to other violation types, and to include content that was reported but not taken down. Next month, the company will also launch a series of public events called Facebook Forums: Community Standards, in Germany, France, the UK, India, Singapore, the US and other countries, where it will get people’s feedback directly.

You can see the Community Standards in full here.

Separately, Facebook has posted another statement, also from Monika Bickert, on how it is deploying technology to keep terrorists off its platform. The main point of this statement, apart from defining what Facebook considers terrorism, is to flag up what a good job it says it is doing to counter it. Four bullet-pointed paragraphs make the points that Facebook is removing more content; that it finds the vast majority of this content itself; that it takes down newly uploaded content quickly; and that old content is removed with the same vigour as new.

Finally, for now at least, Rob Goldman, vice president of ads, has posted another statement explaining what information Facebook’s advertisers learn about the platform’s users. In it, he explains that the data shared with advertisers about users’ behaviour on Facebook, or on sites and in apps that use Facebook ad tools, is not personally identifiable.

“If a bike shop comes to Facebook wanting to reach female cyclists in Atlanta, we can show their ad to women in Atlanta who liked a Page about bikes,” he writes. “But here’s what’s key: these businesses don’t know who you are. We provide advertisers with reports about the kinds of people seeing their ads and how their ads are performing, but we don’t share information that personally identifies you.”
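To illustrate the separation Goldman describes, here is a minimal, entirely hypothetical sketch: the targeting criteria are matched against user data inside the platform, and the advertiser only ever receives aggregate figures, never identities. The data, field names and report format are assumptions made up for this example.

```python
from collections import Counter

# Hypothetical, drastically simplified user records held by the platform.
users = [
    {"id": 1, "gender": "female", "city": "Atlanta", "liked_pages": {"Bikes Monthly"}},
    {"id": 2, "gender": "female", "city": "Atlanta", "liked_pages": {"Cooking"}},
    {"id": 3, "gender": "male",   "city": "Atlanta", "liked_pages": {"Bikes Monthly"}},
    {"id": 4, "gender": "female", "city": "Boston",  "liked_pages": {"Bikes Monthly"}},
]

def matches(user, targeting):
    # Audience selection happens on the platform side, not at the advertiser.
    return (user["gender"] == targeting["gender"]
            and user["city"] == targeting["city"]
            and targeting["interest"] in user["liked_pages"])

# The bike shop's targeting request: women in Atlanta who liked a Page about bikes.
targeting = {"gender": "female", "city": "Atlanta", "interest": "Bikes Monthly"}
audience = [u for u in users if matches(u, targeting)]

# What the advertiser gets back is an aggregate performance report, not user identities.
report = {
    "impressions": len(audience),
    "audience_by_city": Counter(u["city"] for u in audience),
}
print(report)  # {'impressions': 1, 'audience_by_city': Counter({'Atlanta': 1})}
```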

Later in the statement, Goldman explains that Facebook users can’t opt out of ads “because ads are what keeps Facebook free” but explains that users do have different options to control how their data can and can’t be used to show them ads, with a link to more information about ad preferences.

All of this is interesting, and no doubt useful for the average Facebook user to know. The only thing is, it appears in the Facebook news room, which is not a place I imagine many of them frequent. We have asked whether anything is being done to push this information into users’ news feeds, but have had no response as yet. We will update this piece when we get one.