Last updated on April 4th, 2024 at 10:07 am

Tech giant warns that scanning for known child abuse material would compromise the privacy and safety of every user

Apple has cautioned against an Australian proposal requiring tech companies to scan cloud and messaging services for child abuse material, stating it could jeopardize fundamental privacy and security protections and potentially lead to mass surveillance with global implications.

Two mandatory child safety standards, proposed last year by the eSafety commissioner, Julie Inman Grant, would require providers to detect and remove child abuse material and pro-terror material “where technically feasible,” and to disrupt and deter the creation of new material of that nature.

The regulator has emphasized in a related discussion paper that it does not support incorporating vulnerabilities or backdoors to compromise privacy and security in end-to-end encrypted services.

Apple’s submission to the proposals, shared with Guardian Australia, argued that these assurances would not provide protection since they were not explicitly included in the draft standards.

“eSafety asserts that the same protections for end-to-end encryption in the codes also apply to the standards, but this is not confirmed by any language to that effect,” the submission stated.

“We suggest that eSafety adopt a clear and consistent approach by explicitly supporting end-to-end encryption to avoid any uncertainty, confusion, or potential inconsistency across codes and standards.”

The company also argued that the definition of “technically feasible” was overly restrictive, focusing on the cost of developing a new system rather than on whether a specific product design change serves the security interests of users.

These comments from the Cupertino-based company have been supported by privacy advocates and Signal, an encrypted messaging company. Signal has indicated it will legally challenge the standards if required to weaken encryption.

Apple also cautioned that mandating technology to scan cloud services for known child abuse material would jeopardize the privacy and security of all users.

“Scanning for specific content creates the potential for widespread surveillance of communications and storage systems containing data related to the most intimate aspects of many Australians’ lives,” Apple stated.

“These capabilities, as history has shown, are likely to expand to include other types of content (such as images, videos, text, or audio) and categories of content.”

Apple expressed concerns that such surveillance tools could be repurposed to search for additional content, including individuals’ political, religious, health, sexual, or reproductive activities.

“Implementing tools of mass surveillance has broad and adverse implications for freedom of opinion and expression, which in turn impacts democracy as a whole.”

Apple also indicated that scanning individuals’ files and messages could enable law enforcement to bypass legal procedures. Compelling tech companies to adopt such practices would “have significant global consequences,” it noted.

“Countries without the strong legal safeguards available to Australians will exploit and broaden this approach,” Apple stated.

Apple’s Director of User Privacy and Child Safety, Erik Neuenschwander, emphasized the importance of tech companies enhancing protections and minimizing vulnerabilities. He expressed concerns that the absence of encryption protections and the restrictive definition of technical feasibility might introduce vulnerabilities into systems.

Neuenschwander noted that scanning user data would impose a sweeping requirement, obliging companies to hold all data in a readable format for a variety of purposes.