Meta Will Restrict Teens From Viewing Harmful Content

Facebook and Instagram are restricting young users from viewing content about topics like suicide, self-harm, and eating disorders on their platforms.

The company announced that the content, which Meta says may not be “age appropriate” for young people, will not appear in a teen’s feed even if it’s shared by someone they follow. Additionally, if a teen searches for this type of content on the platforms, they’ll be directed to “expert resources” like the National Alliance on Mental Illness.

Meta is rolling out this change to users under 18 over the next several months. In addition to hiding sensitive content, teens’ accounts will default to restrictive filtering settings that limit what kind of content they can see and engage with.

The changes come as Meta and other tech companies face scrutiny from U.S. lawmakers over whether they adequately protect young users’ mental health. Numerous studies have shown a strong correlation between social media use and negative mental health outcomes. Earlier this week, Florida lawmakers introduced a bill that would restrict anyone under the age of 16 from using social media platforms.

Later this month, Meta CEO Mark Zuckerberg — along with a handful of other tech CEOs — will testify before the Senate on child safety.

© 2023 RELEVANT Media Group, Inc. All Rights Reserved.
