How To Use Instagram’s New Features To Stop Abuse
Instagram today rolled out a range of new measures aimed at protecting its enormous community from abuse.
The platform, which has often come under fire for not being tough enough on cyberbullying, now has some key updates which could make a world of difference to Instagrammers.
There are now limits on who can contact you
Users can opt to use ‘Limits’, a feature which restricts comments and DM requests “during spikes of increased attention”.
The tool is easy to switch on and will automatically conceal comments and DMs from people who don’t follow you, or who only recently started following your account.
Instagram’s research shows this is where most abuse comes from.
Limits will most likely benefit those who are suddenly making headlines, such as those competing at the Olympics.
The tool will not switch off comments and messages completely, and will allow users to remain connected to their long-standing followers.
It is available globally from this week, and can be switched on through your account’s privacy settings. Instagram is also exploring ways to prompt users to turn the tool on.
A new warning will pop up before users leave offensive messages
There will now be “stronger warnings when people try to post potentially offensive comments”, according to the platform.
Instagram already shows a warning to users when they are about to post a potentially offensive comment on someone else’s account.
Previously, a message reminding users of the community guidelines would only pop up when they tried to post the comment repeatedly. Instagram would also warn the user that the comment could be removed in future.
Now, Instagram will show the stronger message on a user’s first attempt to post an offensive comment.
The platform says that when the stronger warning appears, users edit or delete their comment approximately 50 percent of the time.
The expansion of the ‘Hidden Words’ folder
Instagram has announced that its ‘Hidden Words’ feature will now be available globally, after being rolled out to a handful of countries earlier this year.
It means offensive words, phrases and emojis are automatically filtered into a hidden folder which users never have to open – spam messages also end up there.
Instagram will encourage accounts with a large following to use the feature.
There is also a new opt-in feature called ‘Hide More Comments’, which hides comments that may be abusive even if they do not break the community rules.
Why is this necessary?
The horrific online racist abuse directed at three black England football players, Marcus Rashford, Jadon Sancho and Bukayo Saka, after they took penalties in the Euro 2020 final left the UK reeling.
England’s Football Association appealed to social media bosses to “step up and take accountability and action to ban users from their platforms”.
It also urged them to “gather evidence that can lead to prosecution and support making their platform free of this type of abhorrent abuse”.
Cyberbullying towards the stars of Love Island has been a major cause for concern in recent years as well, triggering ITV to launch its #BeKind campaign.
In a new blog post, Instagram signalled it was trying to turn a corner.
It acknowledged its “responsibility to make sure everyone feels safe” on its platform.
The government has drafted an Online Safety Bill which would force tech giants to remove toxic or illegal content, or face hefty fines of £18 million – or 10 percent of their annual global revenue – imposed by Ofcom.