Instagram's latest tool asks trolls if they're sure about posting that offensive comment

Instagram has introduced an anti-bullying and harassment tool that it hopes will curb abusive behaviour on its platform, as the Facebook-owned app continues to face growing calls to do more to stop people being targeted with abuse.

The tool aims to prevent bullying by using artificial intelligence (AI) to detect bullying and other harmful content in comments, photos, and videos. When a questionable comment is detected, the app will warn the user that it may be considered offensive and ask whether they still wish to post it.

According to Instagram head Adam Mosseri, early tests found that the tool “encourages some people” not to post their comment and to share a “less hurtful” one instead.

“In the last few days, we started rolling out a new feature powered by AI that notifies people when their comment may be considered offensive before it’s posted,” said Mosseri. “This intervention gives people a chance to reflect and undo their comment and prevents the recipient from receiving the harmful comment notification.”
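The broad pattern Mosseri describes, scoring a comment against a classifier before it is posted and giving the author a chance to reconsider, can be sketched roughly as follows. This is a purely illustrative example, not Instagram's implementation: the `looks_offensive` scoring function, the toy word list, and the 0.5 threshold are hypothetical stand-ins for whatever model and policy the platform actually uses.

```python
# Illustrative sketch of a "reflect before you post" flow.
# Not Instagram's code: the classifier, threshold, and prompt text
# are hypothetical stand-ins for the real system.

def looks_offensive(comment: str) -> float:
    """Hypothetical classifier: returns a score in [0, 1], where
    higher means the comment is more likely to be hurtful."""
    hurtful_words = {"loser", "ugly", "stupid"}  # toy word list for the example
    words = comment.lower().split()
    hits = sum(1 for w in words if w.strip(".,!?") in hurtful_words)
    return min(1.0, hits / max(len(words), 1) * 3)

def submit_comment(comment: str, threshold: float = 0.5) -> bool:
    """Warn the author if the comment scores above the threshold,
    and only post it if they confirm."""
    if looks_offensive(comment) >= threshold:
        answer = input("This may be considered offensive. Post anyway? [y/N] ")
        if answer.strip().lower() != "y":
            print("Comment not posted.")
            return False
    print("Comment posted.")
    return True

if __name__ == "__main__":
    submit_comment("You are such a loser!")
```

In this sketch the check happens client-side at submission time, mirroring the described experience of a prompt appearing before the comment goes live; a real system would use a trained model and server-side enforcement rather than a word list.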

Instagram is introducing the tool after years of pressure to deal with bullying and dangerous content, intensified by widely reported cases such as the suicide of UK teenager Molly Russell in 2017, which her father, Ian Russell, said Instagram had “helped” to bring about.

The popular social media app will also soon begin trialling ‘Restrict’, a way for users to protect their accounts from negative comments. It will let users filter abusive comments so that they are visible only to the person who posted them. A restricted person will not know they have been restricted, and will not be able to see when the person who restricted them is active or whether that person has read their direct messages.

“We’ve heard from young people in our community that they’re reluctant to block, unfollow, or report their bully because it could escalate the situation, especially if they interact with their bully in real life,” said Mosseri, explaining the motivation for the feature. “Some of these actions also make it difficult for a target to keep track of their bully’s behaviour.”
