Facebook has announced a series of steps to expand its fight against fake news, building on efforts the social network has made since misinformation and false stories on its platforms were blamed for influencing the 2016 US election, stirring up racial hatred and more.
Over the past 18 months, Facebook has already partnered with fact-checking firms, removed fake accounts and promoted news literacy. The new initiatives aim to build on its current programs while still maintaining transparency and free speech on its platform.
Facebook is expanding its third-party fact-checking program to additional countries beyond the 14 that the initiative already covers, as well as extending it to cover photos and videos posted on the site. According to Facebook, its fact-checking process reduces the distribution of stories rated as false by an average of 80 per cent, preventing them from spreading too widely.
Photos and videos will now be fact-checked in four countries, with tests including checking video for manipulation or editing that suggests false events, and photos taken out of context or used in association with stories they do not actually illustrate.
In addition, Facebook is attempting to increase the impact of its fact-checking through new techniques, including machine learning to augment decisions made by human fact-checkers, and Schema.org's ClaimReview, an open framework used by multiple tech companies and fact-checking organisations. These changes are designed to help Facebook respond faster to stories, especially in times of crisis, and address a wider range of posts.
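ClaimReview works by letting a fact-checking organisation publish its verdict in a structured, machine-readable form alongside the human-readable article, so platforms can pick the rating up automatically. A minimal sketch of what such markup looks like in JSON-LD, using the ClaimReview type's standard properties; the URLs, organisation name and rating values here are purely illustrative:

```json
{
  "@context": "https://schema.org",
  "@type": "ClaimReview",
  "url": "https://example-factchecker.org/reviews/flooded-airport",
  "claimReviewed": "Photo shows the airport flooded during the storm",
  "itemReviewed": {
    "@type": "Claim",
    "appearance": "https://example-news.com/story/123"
  },
  "author": {
    "@type": "Organization",
    "name": "Example Fact-Checker"
  },
  "reviewRating": {
    "@type": "Rating",
    "ratingValue": "1",
    "bestRating": "5",
    "worstRating": "1",
    "alternateName": "False"
  }
}
```

Because the rating and the claim being checked are expressed as structured fields rather than prose, any platform that ingests the markup can act on the verdict without re-reading the fact-checker's article.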
Fact-checkers already flag repeat offenders, reducing their distribution, removing their ability to monetise and, in some cases, banning them from Facebook. Machine learning will now also be used to help identify and demote foreign pages that are likely to spread financially motivated hoaxes to people in other countries.
Finally, Facebook is aiming to improve measurement and transparency by partnering with academics. Its new initiative examining the role of social media in elections and democracy will become fully independent, hiring staff and establishing the legal and organisational procedures needed to measure the volume and effects of misinformation on Facebook.