Meta debuts privacy-focused Instagram accounts for teens

Meta has unveiled new Instagram accounts tailored for teens, designed to strengthen privacy protections and expand parental controls. The initiative is part of Meta's effort to reduce teens' exposure to harmful content amid growing regulatory and public pressure.

The new accounts, which will be set to private automatically for users under 16, feature strict content filters to shield young users from inappropriate material, and only approved followers can message or tag them.

Meta is also introducing enhanced parental controls, allowing parents to oversee their children's activity, restrict interactions, and manage screen time. Any changes to the default settings will require parental approval.

This development comes as Meta and other social media giants, including TikTok and YouTube, face lawsuits alleging that their platforms contribute to mental health issues in young users. Recent studies and legal actions have highlighted concerns over social media's impact on stress and cognitive development in teenagers.

In response, lawmakers are pushing for stricter regulations, including the Kids Online Safety Act and the Children and Teens' Online Privacy Protection Act.

The new teen accounts follow Meta's earlier decision to shelve Instagram Kids, a separate app for younger users that faced significant opposition. The updated approach aims to address child safety concerns within the existing Instagram platform instead.

Meta will roll out these accounts over the next 60 days in key markets including the U.S., U.K., Canada, and Australia, with a broader European launch planned for later in the year and global availability expected in January 2025.