After facing heavy criticism for poor content moderation, Instagram is updating its policy for removing user accounts and adding a new way for users to appeal its decisions.
Instagram users should now keep count of their violations, since the photo-sharing platform has changed its policy for removing accounts.
- Users who accumulate a certain number of violations within a window of time will now lose their Instagram accounts.
- Previously, Instagram removed accounts only when a certain percentage of their content violated its policies.
- The company will also delete accounts immediately if they violate its drug-sales or sexual-solicitation policies.
- The Facebook-owned photo-sharing app introduced the new policy in the same week that a man suspected of killing a 17-year-old girl posted a photo of her bloody body on what appeared to be his Instagram account.
- The image reappeared after being deleted from the site, raising questions about how well the social media platform moderates its content.
- “Users will be notified if their account is at risk of being removed, along with a way to appeal the company’s decision,” according to Instagram.
- Users can appeal if their content is taken down for violating Instagram’s policies on nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism.
- Instagram has already faced criticism for its weak efforts to fight bullying and self-harm content.
By: Ishant Chaudary, Editorial Desk, DKODING Media