Instagram has introduced a new notification process to better explain to users why their accounts may be at risk of being disabled.
In a post on its blog, Instagram states that the notification will also offer users a chance to appeal when their content is deleted for violations of its "nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism policies".
If the content is found to have been removed in error, Instagram will restore the post and remove the violation from the account's record.
"We've always given people the option to appeal disabled accounts through our Help Center, and in the next few months, we'll bring this experience directly within Instagram," it reveals.
The notification and appeal process is part of a series of changes to Instagram's account disable policy.

https://twitter.com/instagram/status/1151885060863213568
Currently, the photo- and video-sharing social network only disables accounts where a certain percentage of the content violates its policies. Under the new policy, it will also remove accounts that accumulate a certain number of violations within a window of time.
"Similarly to how policies are enforced on Facebook (Instagram's parent company), this change will allow us to enforce our policies more consistently and hold people accountable for what they post on Instagram," it says.
It adds that the updates are an important step in improving its policies and keeping the platform a safe and supportive place.