YouTube removed 8m offensive videos in Q4 2017, machines flagged most of them
- Tuesday, April 24th, 2018
Not to be outdone by Facebook, YouTube has released the first of a regular series of quarterly reports on how it is enforcing its Community Guidelines. It is also introducing a Reporting History dashboard that lets each YouTube user see the status of videos they’ve flagged for review against the Community Guidelines.
The first report, covering October to December 2017, reveals that over 8m videos were removed from YouTube during this period, the vast majority of which were spam or attempts to upload adult content. The role of automated systems in combating offensive material is highlighted by the fact that 6.7m of these were first flagged for review “by machines rather than humans”, as YouTube puts it. Of those 6.7m videos, 76 per cent were removed before they had been viewed by any YouTube users.
Additionally, YouTube notes that at the beginning of 2017, 8 per cent of the videos flagged and removed for violent extremism were taken down with fewer than 10 views. Now, following the introduction of “machine learning flagging” in June 2017, more than half of the videos YouTube removes for violent extremism have fewer than 10 views.
YouTube also said it has filled most of the additional roles needed to meet its share of Google’s goal of having 10,000 people working to address violative content across the company by the end of 2018. Among its hires are full-time specialists with expertise in violent extremism, counter-terrorism, and human rights.