Meta Platforms is introducing improved privacy and parental controls for Instagram users under 18 in response to increasing concerns about the negative impact of social media.
Starting Tuesday, Meta will automatically convert all eligible Instagram accounts to “Teen Accounts,” which will be set to private by default.
These accounts will receive messages and tags only from people they follow or are already connected to, and sensitive content settings will default to the most restrictive level.
Users under 16 will need parental approval to change these default settings. Parents will also have access to tools to monitor their children’s interactions and limit app usage.
This move comes amid ongoing research linking social media use to higher rates of depression, anxiety, and learning difficulties among young people.
Meta, along with ByteDance’s TikTok and Google’s YouTube, faces numerous lawsuits over the addictive nature of social media. Last year, 33 U.S. states, including California and New York, sued Meta for allegedly downplaying the risks associated with its platforms.
Meta’s decision follows its scrapping of a separate Instagram app for teenagers three years ago, after lawmakers and advocacy groups raised safety concerns. Recently, the U.S. Senate advanced two bills aimed at holding social media companies accountable for their impact on children and teens.
As part of the new update, under-18 users will be prompted to close the app after 60 minutes of daily use, and a default sleep mode will silence notifications overnight.
Meta plans to implement these changes for users in the U.S., UK, Canada, and Australia within 60 days, with a rollout in the European Union later this year and globally starting in January.