Many teenagers have seen posts on social media that they believe shouldn’t be allowed. However, very few of them actually report the content through the relevant channels, making it even more difficult for social media firms to deal with inappropriate content on their platforms.
According to a Chartered Institute of Marketing (CIM) survey of 2,032 adults and 550 teens aged 13 to 17, 95 per cent of teens have a social media account, with the most popular being YouTube (79 per cent). This is followed by Instagram (73 per cent), Snapchat (66 per cent), and Facebook (45 per cent).
On these social media platforms, 46 per cent of young people have seen posts that they believe shouldn’t be allowed. Despite this, 62 per cent rarely or never report these posts, and just seven per cent say they always do.
“It is alarming that so many children have seen inappropriate posts on social media and failed to report them. Moreover, while more adults do report harmful content, it is concerning that only one in five always do so,” said Leigh Hopwood, chair of the Chartered Institute of Marketing. “Our research shows that we could make a huge difference quickly if we all took the simple action of hitting the report button when we see something that shouldn’t be on social media. When the new regulations take effect, social media companies will have a legal responsibility to act on the content we report.
“We are calling for a public education campaign to show people, especially children, how to report harmful content and to highlight the importance of reporting it whenever they see it. We don’t believe we should wait for the regulations; this is something that can happen now.”
Seeing posts of this nature discourages 44 per cent of children from engaging online, but 66 per cent say that seeing inappropriate posts on social media would not make them delete their account, while 52 per cent would not be put off signing up for an account in the first place.
Amongst adults, 76 per cent believe it is the responsibility of parents/guardians to protect children on social media, while 74 per cent see social media companies as bearing responsibility. However, when it comes to actually monitoring harmful content, 83 per cent say the responsibility rests with social media companies, 57 per cent think it is the role of the individual, and 49 per cent point to governments.
It’s a similar story when adults are asked who should pay to deal with harmful content on social media: 67 per cent of people over 18 say social media companies should cover the cost of monitoring and regulating negative content, compared with only 14 per cent who say the government should be responsible.