New Protections to Give Teens More Age-Appropriate Experiences on Our Apps
- We will start to hide more types of content for teens on Instagram and Facebook, in line with expert guidance.
- We’re automatically placing all teens into the most restrictive content control settings on Instagram and Facebook and restricting additional terms in Search on Instagram.
- We’re also prompting teens to update their privacy settings on Instagram in a single tap with new notifications.
We want teens to have safe, age-appropriate experiences on our apps. We’ve developed more than 30 tools and resources to support teens and their parents, and we’ve spent over a decade developing policies and technology to address content that breaks our rules or could be seen as sensitive. Today, we’re announcing additional protections that are focused on the types of content teens see on Instagram and Facebook.
New Content Policies for Teens
We regularly consult with experts in adolescent development, psychology and mental health to help make our platforms safe and age-appropriate for young people, including improving our understanding of which types of content may be less appropriate for teens.
Take the example of someone posting about their ongoing struggle with thoughts of self-harm. This is an important story, and can help destigmatize these issues, but it’s a complex topic and isn’t necessarily suitable for all young people. Now, we’ll start to remove this type of content from teens’ experiences on Instagram and Facebook, as well as other types of age-inappropriate content. We already aim not to recommend this type of content to teens in places like Reels and Explore, and with these changes, we’ll no longer show it to teens in Feed and Stories, even if it’s shared by someone they follow.
“Meta is evolving its policies around content that could be more sensitive for teens, which is an important step in making social media platforms spaces where teens can connect and be creative in age-appropriate ways. These policies reflect current understandings and expert guidance regarding teens’ safety and well-being. As these changes unfold, they provide good opportunities for parents to talk with their teens about how to navigate difficult topics.” – Dr. Rachel Rodgers, Associate Professor, Department of Applied Psychology, Northeastern University
We want people to find support if they need it, so we will continue to share resources from expert organizations like the National Alliance on Mental Illness when someone posts content related to their struggles with self-harm or eating disorders. We’re starting to roll these changes out to teens under 18 now, and they’ll be fully in place on Instagram and Facebook in the coming months.
Here’s more detail on how today’s updates expand on our existing protections, in line with feedback from experts:
“Parents want to be confident their teens are viewing content online that’s appropriate for their age. Paired with Meta’s parental supervision tools to help shape their teens’ experiences online, Meta’s new policies to hide content that might be less age-appropriate will give parents more peace of mind.” – Vicki Shotbolt, CEO, ParentZone.org
Updates to Instagram’s and Facebook’s Content Recommendation Settings for Teens
We’re automatically placing teens into the most restrictive content control setting on Instagram and Facebook. We already apply this setting for new teens when they join Instagram and Facebook and are now expanding it to teens who are already using these apps. Our content recommendation controls — known as “Sensitive Content Control” on Instagram and “Reduce” on Facebook — make it more difficult for people to come across potentially sensitive content or accounts in places like Search and Explore.
Hiding More Results in Instagram Search Related to Suicide, Self-Harm and Eating Disorders
While we allow people to share content discussing their own struggles with suicide, self-harm and eating disorders, our policy is not to recommend this content, and we have been focused on ways to make it harder to find. Now, when people search for terms related to suicide, self-harm and eating disorders, we’ll start hiding the related results and directing them to expert resources for help. We already hide results for suicide and self-harm search terms that inherently break our rules, and we’re extending this protection to include more terms. This update will roll out for everyone over the coming weeks.
Prompting Teens to Easily Update Their Privacy Settings
To help make sure teens regularly check their safety and privacy settings on Instagram, and are aware of the more private options available, we’re sending new notifications encouraging them to update their settings to a more private experience with a single tap. If teens choose to “Turn on recommended settings,” we will automatically change their settings to restrict who can repost their content, tag or mention them, or include their content in Reels Remixes. We’ll also ensure only their followers can message them, and we’ll help hide offensive comments.