Instagram Rolling Out Moderation Tools to Combat Harassment

Published: July 31, 2016 9:34 PM


Instagram Logo

Instagram, the popular photo-sharing platform, is rolling out new moderation tools in an effort to combat harassment. Until now, Instagram has relied on users to report abuse in order to remove content that violates the community guidelines. However, there is often disagreement over what exactly counts as abuse or harassment. A user may see comments they find personally offensive, but Instagram may not consider them a violation of its community guidelines.

To deal with divergent opinions about what constitutes harassment, Instagram has decided to give every user the ability to moderate their own posts, so each individual can remove content they find personally offensive. Early in July, a TechCrunch writer noticed that Instagram business accounts had the ability to turn on a comment moderation feature that automatically filters out comments containing words and phrases "often reported as offensive." The new moderation tools, which will be available to all users and not just business accounts, will let users choose which words and phrases they want filtered rather than relying on a predetermined list.
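In rough terms, the filtering described above amounts to hiding any comment that contains a phrase from a block list, with the default list replaceable by the user's own. The sketch below is purely illustrative; the function names, data structures, and default list are assumptions, not Instagram's actual implementation.

```python
# Illustrative sketch of keyword-based comment filtering. The default
# list stands in for Instagram's predetermined "often reported as
# offensive" phrases; a user-supplied set overrides it.

DEFAULT_FILTER = {"spam", "scam"}  # hypothetical stand-in list

def is_filtered(comment, custom_words=None):
    """Return True if the comment contains any filtered word or phrase."""
    phrases = custom_words if custom_words else DEFAULT_FILTER
    text = comment.lower()
    return any(phrase.lower() in text for phrase in phrases)

def visible_comments(comments, custom_words=None):
    """Keep only the comments that pass the filter."""
    return [c for c in comments if not is_filtered(c, custom_words)]
```

A user who filters the phrase "ugly", for example, would see `visible_comments(["great shot!", "ugly photo"], {"ugly"})` return only `["great shot!"]`, while other users' feeds would be unaffected.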

Users will also have the option of turning off comments entirely. This is done on a post-by-post basis, which means users can turn off comments on specific posts they think may attract negative comments while still allowing comments on other posts. However, there will be no global setting to turn off all comments on an account, so someone who wants no comments at all must turn them off for each individual post. Instagram's approach differs from platforms like YouTube, which let users manually remove specific comments; Instagram's system instead uses an automated filter to remove content based on keywords. Both platforms, however, give users the ability to shut off comments.

Nicky Jackson Colaco, Instagram's head of public policy, told The Washington Post, "Our goal is to make Instagram a friendly, fun and, most importantly, safe place for self expression. We have slowly begun to offer accounts with high volume comment threads the option to moderate their comment experience. As we learn, we look forward to improving the comment experience for our broader community."

As Colaco's statement suggests, the new features will be available to high-profile accounts first, so they can be tested before a wider roll out. However, all users should have access to the new moderation tools in the next few months.

Are moderation tools for users a good move for Instagram? Leave your comments below.

Senior Writer

I’m a technology reporter located near the Innovation District of Kitchener-Waterloo, Ontario.