Instagram will kindly let you know you’ve F’ed up before banning you

Yolanda Curtis
July 20, 2019

As for posts that were removed in error, Instagram will restore them and also remove the violation from the account's record.

The new changes are designed to "quickly detect and remove accounts that repeatedly violate our policies". If the content is found to be within community guidelines, it will be restored and the user's account will no longer carry a violation record.

According to Instagram, appeals will initially be available for content deleted on the grounds of "nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism", with the option expanding to more categories in the coming months. Instagram will now send notifications to accounts that are in danger of being deleted, with the option to appeal the violations.

Instagram has made changes to its policy about removing accounts.

Facebook-owned Instagram is stepping up its policies when it comes to users violating its rules, announcing that it will now alert people when their account is at risk of being deleted. In the past, Instagram had only flagged accounts that violated some of its rules to a certain extent.

The notice will list all the posts and comments that have been removed for violating the terms of service, and inform you that "If you post something that goes against our terms again, your account may be deleted, including your posts, archive, messages and followers".

Instagram's account disabling process is about to become far more transparent with this change to its account policies.

Like its parent company, Facebook, Instagram also has problems moderating nudity and sexuality. While the update will not reduce those mistakes, it will make it a little easier for users to appeal the platform's decisions.

Instagram has recently rolled out a good number of new updates, including its "restrict" feature and recently, the "like-hiding" feature.
