Facebook has published a white paper, 'Charting a Way Forward: Online Content Regulation', setting out questions that regulation of online content might address.
The paper poses four questions which, Facebook says, go to the heart of the debate about regulating content online:
The first is: how can content regulation best achieve the goal of reducing harmful speech while preserving free expression? By requiring systems such as user-friendly channels for reporting content or external oversight of policies or enforcement decisions, and by requiring procedures such as periodic public reporting of enforcement data, regulation could, Facebook believes, provide governments and individuals with the information they need to judge social media companies' efforts accurately.
The second question is: how can regulations enhance the accountability of internet platforms? Here, Facebook says that regulators could consider certain requirements for companies, such as publishing their content standards, consulting with stakeholders when making significant changes to standards, or creating a channel for users to appeal a company’s content removal or non-removal decision.
Thirdly, should regulation require internet companies to meet certain performance targets? Here, Facebook argues that companies could be incentivized to meet specific targets, such as keeping the prevalence of violating content below some agreed threshold.
Finally, should regulation define which “harmful content” should be prohibited on the internet? Here, Facebook notes that laws restricting speech are generally implemented by law enforcement officials and the courts, but says that internet content moderation is fundamentally different. Facebook believes that governments should create rules to address this complexity that recognize user preferences and the variation among internet services, that can be enforced at scale, and that allow for flexibility across language, trends and context.
Facebook believes that the development of regulatory solutions should involve not just lawmakers, private companies and civil society, but also those who use online platforms. The company references five principles it believes are important, based on lessons it has learned from its work in combating harmful content and its discussions with others. These are:
Incentives – Ensuring accountability in companies’ content moderation systems and procedures will be the best way to create the incentives for companies to responsibly balance values like safety, privacy, and freedom of expression.
The global nature of the internet – Any national regulatory approach to addressing harmful content should respect the global scale of the internet and the value of cross-border communications. It should aim to increase interoperability among regulators and regulations.
Freedom of expression – In addition to complying with Article 19 of the ICCPR (the International Covenant on Civil and Political Rights, and related guidance), regulators should consider the impacts of their decisions on freedom of expression.
Technology – Regulators should develop an understanding of the capabilities and limitations of technology in content moderation and allow internet companies the flexibility to innovate. An approach that works for one particular platform or type of content may be less effective (or even counterproductive) when applied elsewhere.
Proportionality and necessity – Regulators should take into account the severity and prevalence of the harmful content in question, its status in law, and the efforts already underway to address the content.
Facebook said it hopes the white paper will help to stimulate further conversation around the regulation of content online, and that it plans to publish similar papers on elections and privacy in the coming months.