Meta Platforms has introduced enhanced privacy and parental controls for Instagram users under 18, in response to growing concerns over the negative effects of social media on young users.
Under the new policy, all Instagram accounts belonging to users under 18 will automatically be converted to "Teen Accounts," which are set to private by default. These users will be able to receive messages and tags only from accounts they follow or are already connected to, and their sensitive-content settings will be adjusted to the most restrictive level.
For users under 16, changes to these default settings will require parental permission. Parents will also gain access to new tools that allow them to monitor their children’s interactions and limit their time on the app.
The move comes in the wake of numerous studies linking social media use to increased rates of depression, anxiety, and learning disabilities among young people. It also follows a series of lawsuits filed against Meta, TikTok, and YouTube by parents and school districts alleging that these platforms are addictive and harmful to children. In addition, 33 US states, including California and New York, have sued Meta for allegedly misleading the public about the risks of its platforms.
Earlier this year, the US Senate advanced two online safety bills — the Kids Online Safety Act and the Children and Teens' Online Privacy Protection Act — to hold social media companies accountable for the impact their platforms have on young users.
As part of the updates, users under 18 will be prompted to close the app after 60 minutes of daily use, and a default sleep mode will silence notifications overnight. Meta plans to roll out the changes in the US, UK, Canada, and Australia within 60 days, extend them to the European Union later this year, and begin a global release in January.