Josh Taylor 

Apple warns Australian proposal to force tech companies to scan cloud services could lead to mass surveillance

Scanning for known child abuse material would compromise the privacy and safety of every user, tech giant says

‘Tools of mass surveillance have widespread negative implications for freedom of opinion and expression and, by extension, democracy as a whole,’ Apple has said in response to an eSafety commissioner proposal. Photograph: Philip Fong/AFP/Getty Images

Apple has warned an Australian proposal to force tech companies to scan cloud and messaging services for child-abuse material risks “undermining fundamental privacy and security protections” and could lead to mass surveillance with global repercussions.

Under two mandatory standards aimed at child safety released by the regulator last year, the eSafety commissioner, Julie Inman Grant, proposed that providers should detect and remove child-abuse material and pro-terror material “where technically feasible” – as well as disrupt and deter new material of that nature.

The regulator has stressed in an associated discussion paper it “does not advocate building in weaknesses or back doors to undermine privacy and security on end-to-end encrypted services”.

In Apple’s submission to the proposals, provided to Guardian Australia, it said this would offer no protection, given the assurances were not explicitly included in the draft standards.

“eSafety claims the same protections for end-to-end encryption in the codes apply to the standards but this is not supported by any language to that effect,” the submission said.

“We recommend that eSafety adopt a clear and consistent approach expressly supporting end-to-end encryption so that there is no uncertainty and confusion or potential inconsistency across codes and standards.”


The company also flagged that the definition of what might be “technically feasible” was too narrow and was focused on the cost to develop a new system, without considering “whether a particular product design change is in the best interests of securing its users”.

The Cupertino-based company’s comments have been echoed by privacy advocates as well as encrypted messaging company Signal, which has flagged it will challenge the standards in court if forced to weaken encryption.

Apple also warned that a requirement for technology to scan cloud services for known child-abuse material would compromise the privacy and safety of every user.

“Scanning for particular content opens the door for bulk surveillance of communications and storage systems that hold data pertaining to the most private affairs of many Australians,” Apple said.

“Such capabilities, history shows, will inevitably expand to other content types (such as images, videos, text, or audio) and content categories.”

Apple said such surveillance tools could be reconfigured to search for other content, such as a person’s political, religious, health, sexual or reproductive activities.

“Tools of mass surveillance have widespread negative implications for freedom of opinion and expression and, by extension, democracy as a whole.”

The company also suggested that scanning people’s files and messages could allow law enforcement to circumvent legal processes. Forcing tech companies to implement such scanning would “have far-reaching global repercussions”, it said.

“Countries that lack the robust legal protections afforded to Australians will leverage this and expand on it,” Apple said.

Apple’s director of user privacy and child safety, Erik Neuenschwander, said tech companies should be improving protections and reducing vulnerabilities. The standards’ lack of explicit protections for encryption, combined with the narrow definition of technical feasibility, could create weaknesses in systems, he said.

Neuenschwander said that scanning user data was a “wide-ranging requirement” and would require companies to be in possession of all data in a readable form for all sorts of purposes.

“That could include everything from the company’s own processing, to law enforcement requests, to potentially attackers getting into the systems and getting that data illicitly. And that is part of our concern around the lack of support for encryption.”

The company pointed to its various parental control functions as part of the work the company has undertaken around child safety.

Apple has not gone as far as to threaten to withdraw iMessage or iCloud from Australia, as it did in the UK when a similar online safety law was proposed – and ultimately shelved – last year.

Inman Grant told Senate estimates last week that there were a “lot of technical issues [and] a lot of good feedback” in the 50 submissions received during the consultation on the proposal.

“We will incorporate what we can and what we think makes sense and provides greater clarity,” she said.

Other submissions were expected to be published this month, while the finalised standards were likely to go to parliament for approval by May.
