UK MPs say tech firms should be made liable for fake news

Tim Maytom

Platforms like Twitter, Facebook and YouTube should be made liable for "harmful and misleading" material published on their websites and pay a levy so that they can be effectively regulated, British lawmakers have suggested.

The idea was put forward by the Digital, Culture, Media and Sport Committee, chaired by Damian Collins MP, in its interim report from an inquiry into disinformation and 'fake news'. The report also warned that the rampant misuse of personal data could lead to serious problems going forward.

"We are facing nothing less than a crisis in democracy - based on the systemic manipulation of data to support the relentless targeting of citizens, without their consent, by campaigns of disinformation and messages of hate," said Collins.

The interim report also suggested that the rules on political campaigning should be updated to reflect the digital world, with a public register of political advertising designed to increase transparency and accountability. The report also called for a ban on "micro-targeted political advertising" that made use of Facebook's 'lookalike audiences' targeting.

"Companies like Facebook made it easy for developers to scrape user data and to deploy it in other campaigns without their knowledge or consent," said Collins. "Throughout our inquiry these companies have tried to frustrate scrutiny and obfuscate in their answers. They must be made responsible, and liable, for the way in which harmful and misleading content is shared on their sites."

The report also specifically criticised Facebook for its role in spreading misinformation in Myanmar during the ethnic cleansing that occurred there. It joined the UN in blaming Facebook's Free Basics platform, popular in Myanmar, for spreading hate speech against the Rohingya ethnic group.

The report asserted that Facebook's platform in the country had been co-opted, potentially worsening the ongoing ethnic cleansing and curbing the success of aid programmes designed to help.

"The activity of Facebook undermines international aid to Burma, including the UK Government's work," said the report. "Facebook is releasing a product that is dangerous to consumers and deeply unethical."

Myanmar's military has displaced more than 500,000 Rohingya during a brutal campaign that has involved murder, arson, torture, sexual assault and the destruction of entire communities. The events have been met with international condemnation.

Facebook's Free Basics program was ended in Myanmar in September 2017, and the firm has acknowledged that it didn't do enough to address hate speech posted using its services. Mark Zuckerberg admitted that users were trying to incite "real harm" using Facebook.

"We share the UK Parliamentary Committee's ambitions to tackle misinformation online," said a Facebook spokesperson. "We have acknowledged that we were too slow to address the abuse of our services in Myanmar, and are taking active steps to reduce the spread of misinformation and hate speech, including removing fake and abusive accounts. We are investing in people, technology and programs to help address very serious challenges in Myanmar, and we will continue to engage with the UK Parliamentary Committee and others on our work there."