Last updated on March 23rd, 2024 at 10:24 am

The feature will function even on encrypted messages, indicating the platform’s intention to introduce client-side scanning

Instagram will start scanning messages sent to and from under-18s to protect them from “inappropriate images,” Meta announced. The feature, expected to roll out later this year, will work even on encrypted messages, indicating the company plans to introduce client-side scanning for the first time.

However, the update will not fulfill contentious requests for inappropriate messages to be reported back to Instagram servers. Instead, only a user’s personal device will know whether a message has been filtered out, leading to criticism that the promise is another instance of the company “grading its own homework.”

This is the latest move by Meta to address concerns that its plans to encrypt direct messages on Facebook Messenger and Instagram could endanger children and young people.

The general description of the planned feature is reminiscent of a setting called “communication safety” introduced by Apple in 2021. This setting detects nude photos and videos sent to children’s devices, automatically blurs them, and gives the child the option to view them or to contact a trusted adult.

However, these plans do not go as far as the more robust forms of client-side scanning advocated by children’s safety groups. These groups have called for measures that would more actively report inappropriate messages to the service’s moderators, allowing for the tracking and apprehension of repeat offenders.

Meta is also confronting a lawsuit in New Mexico alleging that it failed to safeguard children on its platforms; recent legal filings in that case indicate that around 100,000 children using Facebook and Instagram experience online sexual harassment daily.

Mark Zuckerberg, Meta’s CEO, is scheduled to testify before the US Congress on Wednesday alongside other social media leaders in a hearing concerning child safety.

In addition to the pledge of forthcoming scanning tools, Instagram has introduced a few immediate updates to enhance teenager safety on the platform. Users under 16 will now have their privacy settings changed by default so that they cannot receive direct messages from anyone they do not follow. Previously, this restriction applied only to adults messaging teenagers.

New features are being introduced for parents using the service’s “supervision” tools, enabling them to link their Instagram accounts to their children’s accounts to set time limits, monitor blocked contacts, and receive notifications of setting changes. Parents will now be prompted to actively approve or deny requests from children under 16 to adjust safety settings.

“As with all our parental supervision tools, this new feature aims to encourage offline discussions between parents and their teens as they navigate their online lives together and determine what’s best for them and their family,” Meta stated.

Arturo Béjar, a former senior engineer and consultant at Meta, said these changes should be accompanied by regular updates on the number of unwanted advances teenagers receive on Instagram; without such data, it would be difficult to assess the impact of the safety updates. According to Béjar’s own research on Instagram users in 2021, one in eight children aged 13-15 on the platform had experienced unwanted sexual advances.

“This is another promise where Meta will evaluate its own progress,” he stated. “Without quarterly reports on unwanted advances as reported by teenagers, how can we verify that they’ve kept their promise or assess its impact? Even now, after more than two years of Meta being aware that one in eight kids receives unwanted advances each week, there is still no way for a teenager to flag or report such advances.”