Meta, the company behind Instagram, Facebook, and Messenger, has launched new safety rules to protect teens. The changes will prevent teens under 16 from livestreaming on Instagram unless their parents approve. This new policy is part of Meta’s broader efforts to improve safety for young users across its platforms.
New Restrictions for Teen Accounts
Meta’s update affects teens under 16, who must now get permission from their parents before using certain features. For example, they cannot use Instagram Live without approval, nor can they turn off the message filters that blur images containing suspected nudity.
This extends the safety tools Meta introduced for Instagram in 2023, which help parents monitor their children’s activity. The features include:
- Daily Time Limits: Parents can set limits on how much time their teens spend on Instagram each day.
- Lock Access During Certain Hours: Parents can block access to Instagram during specific times, such as school hours or bedtime.
- Messaging Oversight: Parents can see who their teens are messaging on Instagram.
Now, these same features are coming to Facebook and Messenger. In the US, UK, Canada, and Australia, under-16s will need parental approval to change any of these settings, while teens aged 16 and 17 can adjust them on their own.
What Child Safety Groups Are Saying
While many welcome the new rules, some child safety groups believe more needs to be done. The NSPCC (National Society for the Prevention of Cruelty to Children) supports the update but wants Meta to focus on stopping harmful content from appearing in the first place.
Matthew Sowemimo, a spokesperson for the NSPCC, said, “These changes must go hand-in-hand with better efforts to keep dangerous content off the platforms.”
Meta reports that over 54 million teens use Instagram, with 90% of 13- to 15-year-olds keeping the default safety settings on. Still, many child safety groups want more proactive measures to prevent harmful material from reaching young users.
Stricter Online Safety Laws in the UK
These changes come as the UK’s Online Safety Act takes effect. The law requires companies such as Meta, Google, and Reddit to remove harmful content quickly, and it obliges platforms to protect users under 18 from dangerous material, including content about suicide and self-harm.
There has been concern that the UK-US trade deal could weaken the Online Safety Act. Child safety groups strongly oppose any attempts to weaken these protections.
Nick Clegg, Meta’s president of global affairs, explained that the company’s goal is to help parents. He said, “We want to shift the balance in favor of parents.” However, he also admitted that many parents don’t use the available controls.
How Meta Is Responding
Meta’s update signals its desire to do more to protect teens. The new rules for Instagram, Facebook, and Messenger will give parents more control over their children’s online activity.
But critics argue that Meta could do more to prevent harmful content before it even reaches users. They believe the company should be more proactive in monitoring content and ensuring that dangerous material is removed right away.
For its part, Meta says it is committed to improving teen safety online, with Clegg stressing that the company’s focus is on helping parents stay in control of their children’s online experiences.
What’s Next for Teen Safety on Social Media?
As Meta continues to roll out new features, teen safety looks set to remain a priority. But as laws like the UK’s Online Safety Act take effect, Meta and other platforms will face growing pressure to strengthen their efforts.
Meta’s new rules are a good start, but critics believe the company should go further: platforms need to act faster to remove harmful content and to prevent it from reaching vulnerable users in the first place. The new rules for livestreaming, message filters, and parental controls are a step in the right direction, but the fight for safer social media continues.
Meta’s latest changes to Instagram, Facebook, and Messenger are a response to growing concerns about online safety. By adding more tools for parents and requiring teens to ask for permission before using certain features, Meta is taking steps to protect young users; many child safety advocates, however, believe it still needs to do more.