Instagram is updating its account disable policy to be more consistent with Facebook’s guidelines and procedures, according to a blog post by the social media app. Instagram hopes these updates will make detecting and deleting violating content a faster and more efficient process, so that the platform can focus on being a “supportive place for everyone.”
Before the updates, Instagram's policy was to remove accounts that posted a certain percentage of violating content. While the app will still follow that rule, it will now also disable accounts that incur a certain number of violations within a specific window of time.
“Similarly to how policies are enforced on Facebook, this change will allow us to enforce our policies more consistently and hold people accountable for what they post on Instagram,” said Instagram.
While Instagram will be making its policy stricter, it will also be implementing a new notification system for when an account is in danger of being removed. Along with receiving a warning notification, account holders will be able to appeal their violations via the online Help Center. Eventually, Instagram will allow appeals to come straight from the app.
Starting now, content removed for violating Instagram's policies on nudity and pornography, bullying and harassment, hate speech, drug sales, and terrorism can be appealed, with more categories coming soon. If content is found to have been removed in error, the post will be restored and the violation will be removed from the account's history.