
The move comes as global regulators urge Meta to shield children from inappropriate content on its apps

On Tuesday, Meta announced that it will hide more sensitive content from teenagers on Instagram and Facebook, responding to global regulatory pressure to safeguard children from harmful content on its platforms.

The change aims to reduce teenagers’ exposure to sensitive content about suicide, self-harm, and eating disorders when they use features such as search and explore on Instagram. Meta stated that all teenage accounts will default to the most restrictive content control settings on Instagram and Facebook, and Instagram will also restrict additional search terms, as outlined in Meta’s blog post.

“Our goal is for teenagers to have safe, age-appropriate experiences on our platforms,” the blog post states. “Today, we’re introducing further safeguards that specifically target the content teenagers encounter on Instagram and Facebook.”

Even if a teenager follows an account that posts about sensitive topics, those posts will be excluded from the teenager’s feed, according to Meta’s blog. The company stated that these measures, expected to be implemented over the next few weeks, will contribute to a more “age-appropriate” experience.

“For instance, consider someone sharing their ongoing struggle with self-harm ideation. This is a significant narrative that can help destigmatize these issues, but it’s a nuanced subject and may not be suitable for all young individuals. From now on, we will begin to filter out this kind of content from teenagers’ experiences on Instagram and Facebook,” the company’s blog post explains.

Meta faces scrutiny in both the United States and Europe amid accusations that its apps are addictive and have contributed to a mental health crisis among youth. In October, attorneys general from 33 US states, including California and New York, filed a lawsuit against the company, alleging that it repeatedly misled the public about the risks associated with its platforms. In Europe, the European Commission has requested information on Meta’s measures to protect children from illegal and harmful content.

This regulatory pressure came after a former Meta employee, Arturo Bejar, testified before the US Senate, claiming that the company was aware of harassment and other dangers teenagers faced on its platforms but did not take appropriate action.

Bejar urged the company to implement design modifications on Facebook and Instagram to encourage users to engage in more positive behaviors and to offer improved tools for young individuals to handle negative experiences. Bejar disclosed that his own daughter had encountered unwanted advances on Instagram, a concern he raised with Meta’s senior leadership. However, according to his testimony, Meta’s senior executives disregarded his appeals.

Businesses have traditionally targeted children as a desirable demographic, aiming to cultivate them as consumers during their impressionable years and establish brand loyalty.

In recent years, Meta has been locked in a heated rivalry with TikTok for the attention of young users, whom the company also sees as a way to attract more advertisers eager to reach that audience.