Instagram has added a new way for users to better manage their experience on the platform with a ‘Sensitive Content Control’ setting, which offers three options for restricting the content displayed in the app.
The new Sensitive Content Control options are available in the latest version of the app.
If you go to Settings > Account > Sensitive Content Control, you’ll now be able to choose between these options:
- Allow – You may see more photos or videos that could be upsetting or offensive
- Limit (default) – You may see some photos or videos that could be upsetting or offensive
- Limit Even More – You may see fewer photos or videos that could be upsetting or offensive
The middle option, “Limit”, is the default setting for all users, while only users over 18 can select the “Allow” option, which removes any restrictions on the content displayed.
To be clear, Instagram notes that it already has various rules and processes in place to protect users from offensive content, with specific settings covering both the main feed and Stories, as well as Explore.
“We don’t allow hate speech, bullying, or any other content that may present a risk of harm to people. We also have rules about the type of content we show you in places like Explore; we call these our Recommendation Guidelines. These guidelines are designed to ensure that we don’t show you sensitive content from accounts you don’t follow. You can think of sensitive content as posts that don’t necessarily break our rules, but could potentially upset some people, such as posts that may be sexually suggestive or violent.”
So this is about providing an extra level of protection for users who may not want to see this type of content, as Instagram’s systems can now automatically detect certain types of content and keep it out of sight for users who adjust their sensitivity settings accordingly.
Instagram has advanced its systems on this front over time. In 2019, Instagram explained how its image recognition systems were increasingly able to identify content that came close to violating its Community Guidelines, but didn’t quite cross the line.
This borderline content is often shown less prominently, as part of Instagram’s efforts to protect users from exposure to offensive content in the app. But to ensure it doesn’t wrongly penalize creators, Instagram has to ensure its systems detect these elements in uploaded posts with a high level of accuracy.
To do this, Instagram’s content moderators tag borderline content as part of their regular work, which Instagram then uses to train its AI systems. Over time, this process has made the platform better at limiting the reach of borderline material, and it has now advanced to the stage where Instagram can offer users more control options, based on the system’s improved understanding.
It won’t always get it right. As with any AI system, there will be some false positives, but if you’re looking to avoid this type of material, this could be an easy way to limit your exposure and improve your in-app experience.
And given the younger skew of Instagram’s audience and the reach such content can see, it’s important for Instagram to protect its users where it can. That’s also a point to note in your marketing approach, especially if you’re looking to push the boundaries, or if your visuals could potentially be wrongly identified as offensive based on the aforementioned categories.