Ofcom to be given new powers to regulate social media firms over harmful content
- Wednesday, February 12th, 2020
Broadcast and telecoms regulator Ofcom is to gain new powers to force social media firms to take action over harmful content on their platforms, the BBC reports. The new plans will be announced by the government later today, broadening Ofcom’s scope beyond the media to online safety. It will mark a significant shift in the regulatory approach towards social media firms, which to date have been largely self-regulating.
The move marks the government’s first response to the Online Harms consultation it carried out in the UK last year, which received 2,500 replies. The new rules will apply to firms hosting user-generated content, including comments, forums and video-sharing, which in practice means all the big social platforms.
According to the report, the intention is that the government will set the direction of policy, while Ofcom will have the freedom to draw up the details and adapt them in response to new online threats as they emerge.
Calls for social media firms to take more responsibility for the content on their platforms have increased since the death of teenager Molly Russell, who took her own life in 2017 after viewing graphic content on Instagram. Molly’s father Ian subsequently said he believed Instagram was partly responsible for his daughter’s death.
Scott Morrison, Director at Berkeley Research Group, believes Ofcom may not find its new, extended role plain sailing. He said: “Ofcom’s appointment as the internet regulator throws up some interesting challenges. Ofcom has an analogous role in regulating broadcast TV and video-on-demand services, and so should have transferable skills and people to take on the new role. However, policing the internet is ultimately more challenging than regulating broadcasting services. Online firms operate internationally, and Ofcom may face jurisdictional issues in attempting to regulate them. Furthermore, the sheer volume of online content is vast: it is estimated that over 500 hours of content are uploaded to YouTube every minute. Policing this content, whether by the online firms or by Ofcom, cannot be undertaken by humans alone, and will require some form of AI.
“For Ofcom to be an effective regulator, it will require the appropriate powers to tackle online harms. It will need to be able to impose sufficient sanctions so that online firms comply with the codes of conduct.”
And Chris Daly, chief executive of the Chartered Institute of Marketing, said the announcement is good news for the marketing industry.
“It’s important that this critical role is handed to an established and credible body such as Ofcom,” Daly said. “Digital channels have completely changed the way the marketing sector operates as the use of social media has grown. Ofcom’s new powers will give businesses greater confidence that they can invest in social media marketing without inadvertently supporting platforms where users are at risk.
“Our recent research uncovered an alarming number of social media users, including children, seeing inappropriate and harmful online content, which led us to call on the Government to step in and regulate.
“Protecting people online isn’t only about regulation, however. The public, and children in particular, need to know what they can do to protect themselves and how and why they should be reporting harmful content, something only a small proportion of people currently do.
“We believe that the media literacy strategy due to be published in the summer of 2020 will be critical to making regulation a success, and we hope it will be accompanied by a significant marketing campaign to raise awareness of online safety.”