MPs slate Google, Facebook and Twitter on hate speech

Executives from Facebook, Google, and Twitter were lambasted by MPs yesterday as the Home Affairs Committee took them to task over their approach to removing abusive and extremist content from their platforms.
Facebook and Google told the Committee that between them they had hired 17,500 staff to monitor their platforms for abusive content, but members of the Committee were unimpressed, pointing to numerous failings across all three platforms.

Conservative MP Tim Loughton dismissed Facebook's concerns about removing Britain First's profile page, following the removal of its leaders' pages from Twitter, on the grounds that the firm is "very cautious" about removing political speech. Facebook's director of public policy Simon Milner argued: "One of the issues with this is that content from videos like this can be used by news organisations to highlight their activities. With this material, context really matters. There is a chance that we are taking down important journalism."

But Mr Loughton said: "This is not about taking away somebody's rights to criticise somebody whose politics they don't agree with. It's about not providing a platform – whatever the ills of society you want to blame it on – for placing stuff that incites people to kill, harm, maim, incite violence against people because of their political beliefs."

"You are profiting from the fact that people use your platforms and you are profiting, I'm afraid, from the fact that people are using your platforms to further the ills of society, and you're allowing them to do it and doing very little proactively to prevent them."

Committee chairwoman Yvette Cooper said the three companies needed to do more on hate speech, and criticised Google for taking eight months to remove a racist video on YouTube that she had flagged to the company on numerous occasions. Even after it had been removed from YouTube, it was still accessible on Facebook and Twitter. Ms. Cooper said she found it “incomprehensible” that information had not been shared between the three companies.

She added that in the course of searching for the YouTube video, she was constantly recommended other abusive content. "Is it not simply that you are actively recommending racist material into people's timelines?" she asked. "Your algorithms are doing the job of grooming and radicalising."

Twitter also came under fire from Ms. Cooper over racist comments aimed at MP Diane Abbott and death threats aimed at MP Anna Soubry that had not been removed from the platform.

Twitter's vice-president of public policy, Sinead McSweeney, said it was difficult to ensure no abusive tweets remained on the platform: "You can clean a street in the morning and it can still be full of rubbish by 22:00."

Google's vice-president of public policy, Dr Nicklas Lundblad, told the Committee that the company was using machine learning and AI to help it flag unsuitable content.

Whatever their individual efforts, however, as long as the three companies continue to work in isolation, MPs and the public are likely to remain unconvinced that they are taking their responsibilities seriously enough.