To keep content in check and ensure the safety of minors, Instagram has announced new policies aimed at protecting teenagers on the platform.
In a blog post titled “Continuing to Make Instagram Safer for the Youngest Members of Our Community”, the Facebook-owned platform said that it is blocking adults from sending Direct Messages to any teenager who does not follow them.
From now on, Instagram will warn teens to be cautious when talking to adults they are already in contact with on the platform. This will include safety notices (in the form of banners) inside the app that warn users about an adult who has been “exhibiting potentially suspicious behaviour”, such as sending a large number of message requests to other minors.
“For example, if an adult is sending a large amount of friend or message requests to people under 18, we’ll use this tool to alert the recipients within their DMs and give them an option to end the conversation, or block, report, or restrict the adult,” the company said, adding that some unspecified countries would see the feature roll out this month.
Instagram will also restrict adults who have been exhibiting potentially suspicious behaviour from seeing minors’ accounts in 'Suggested Users', Reels or Explore. The company will automatically hide such adults’ comments on public posts by teens, and it will encourage minors and younger users to opt for a private profile by explaining the benefits involved.
The company has also announced that it would use machine learning and artificial intelligence to estimate users’ ages in order to protect them on the platform. It added that these changes are necessary given its plan to bring end-to-end encryption to all of its chat platforms in the future, including Instagram Direct Messages.