
Meta to shield teens from harmful content on Instagram and Facebook

  • Nishadil
  • January 10, 2024

Meta, the parent company of Instagram and Facebook, has announced that it will automatically filter out harmful content, such as posts about self-harm, graphic violence, and eating disorders, from teen accounts on its social media platforms. This long-anticipated move is expected to take effect in the next few weeks.

This is a major change by the tech giant, aimed at making the experience on its sites safer and more age-appropriate for younger users. The new content restrictions come amid a wave of lawsuits from more than 40 states accusing Meta of deceiving the public about the risks its platforms pose to young people. The state attorneys general, who filed a lawsuit against Meta in October, cited internal Meta documents showing that the company designed its products to exploit young users’ vulnerability to peer pressure and potentially dangerous behavior.

Meta denied in November that it designed its products to be addictive for teens. Teen accounts, meaning accounts belonging to users under 18 years old based on the birth date entered during sign-up, will be automatically placed into the most restrictive content settings. Teens under 16 years old will not be exposed to sexually explicit content.

On Instagram, this setting is called Sensitive Content Control; on Facebook, it is called Reduce. Previously, teens could choose less strict settings. Teen users cannot opt out of these new settings. The new restricted status of teen accounts means that teens cannot see or search for harmful content, even if it is shared by a friend or someone they follow.

For instance, if a teen’s friend had been posting about dieting, those posts would no longer be visible to the teen. However, teens might still see content related to a friend’s recovery from an eating disorder. A company spokeswoman said teens will not necessarily know what they are missing because the content will not be available.

Meta says it consulted with experts in adolescent development to determine what types of content are inappropriate for teens. Meta said its algorithms already limit the harmful content shown to teens in Reels and on the Explore page. With the new changes, such content will not be shown to teens in their Feeds and Stories.

The changes will be automatically applied to existing teen accounts starting this week. Newly created teen accounts will also be restricted to age-appropriate content. When teens search for terms related to suicide, self-harm, and eating disorders, Instagram and Facebook will hide related results and direct them to expert resources for help.

The company already hides results for suicide and self-harm search terms that violate the platforms’ rules; now, Meta is extending that protection to additional terms. Meta is also introducing a tool to make teens’ sharing settings more private on Instagram. A notification will appear that allows teen users to “turn on recommended settings” with one tap.

The notification pops up when the teen account is tagged by, or has some other interaction with, an unknown account.