Fourteen of the world’s most respected information security experts have warned that client-side scanning “is a dangerous technology” that threatens democracy.
The cryptographers and engineers, whose careers have laid the groundwork for the internet’s fundamental security protocols, have authored a paper titled “Bugs in Our Pockets” to explain their concerns.
They say that introducing the technology – recently proposed by Apple as a method of pre-emptively scanning all iPhones for child sexual abuse material (CSAM) – “would be an extremely dangerous societal experiment” that risks handing governments enormous surveillance powers.
What is the technology?
Client-side scanning (CSS) is a way of searching for particular files on a personal device without those files having to be shared with someone else, as happens with server-side scanning.
It is intended to protect users’ privacy by preventing other people from seeing innocent images, but the experts argue these protections are not guaranteed.
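In broad terms, CSS systems typically work by computing a fingerprint of each file on the device and checking it against a database of fingerprints of known illegal material, with only matches being flagged. The sketch below illustrates that on-device matching step; it is a simplified illustration in Python, not Apple’s NeuralHash or any vendor’s actual implementation, and the hash function, database contents, and folder scanned are placeholders chosen for clarity.

```python
import hashlib
from pathlib import Path

# Stand-in for a database of fingerprints of known illegal content.
# Real deployments use perceptual hashes of images rather than SHA-256;
# the value below is a placeholder and will never match a real file.
KNOWN_CONTENT_HASHES = {
    "d2f0c7e5...",  # placeholder fingerprint
}


def fingerprint(path: Path) -> str:
    """Compute a fingerprint of a file.

    A real CSS system would use a perceptual hash that tolerates resizing
    and re-encoding; SHA-256 is used here purely as a simple stand-in.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_device(root: Path) -> list:
    """Return files whose fingerprints match the known-content database.

    The key point is that this matching happens on the user's own device,
    before anything is uploaded or shared.
    """
    return [
        p for p in root.rglob("*")
        if p.is_file() and fingerprint(p) in KNOWN_CONTENT_HASHES
    ]


if __name__ == "__main__":
    for match in scan_device(Path.home() / "Pictures"):
        # A deployed system would report matches to the provider or authorities.
        print(f"match found: {match}")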
CSS was the model proposed by Apple for its “critically important child safety features”, which were due to launch in the US later this year before being delayed following concern and controversy.
“The ability of citizens to freely use digital devices, to create and store content, and to communicate with others depends strongly on our ability to feel safe in doing so. The introduction of scanning on our personal devices — devices that keep information from to-do notes to texts and photos from loved ones — tears at the heart of privacy of individual citizens. Such bulk surveillance can result in a significant chilling effect on freedom of speech and, indeed, on democracy itself.”
– Bugs in our Pockets: The Risks of Client-Side Scanning, Abelson et al.
Apple said its feature was designed with privacy protections that would ensure it was “limited to detecting CSAM stored in iCloud”.
The company added that it would refuse government demands to use the system to search for images in other criminal or national security investigations.
But the researchers warn – of CSS as a design in general, not just with regard to Apple’s plans – that “even if deployed initially to scan for child sex-abuse material, content that is clearly illegal, there would be enormous pressure to expand its scope” for other purposes.
Is this about ‘back door’ access for police?
As a system for searching for files on a device, CSS offers tech companies and law enforcement a solution to the ongoing debate about encryption and public safety.
It ostensibly allows users to keep their data private while empowering police to investigate child abuse cases without creating a so-called “back door” that could be abused by criminals – though the researchers warn CSS itself may still be abused.
Their main argument is that the introduction of CSS systems “would be much more privacy invasive than previous proposals to weaken encryption” because of the powers it hands to state authorities.
“Rather than reading the content of encrypted communications, CSS gives law enforcement the ability to remotely search not just communications, but information stored on user devices.”
Referencing previously suggested solutions to the encryption debate, they wrote: “The proposal to pre-emptively scan all user devices for targeted content is far more insidious than earlier proposals for key escrow and exceptional access.
“Instead of having targeted capabilities such as to wiretap communications with a warrant and to perform forensics on seized devices, the agencies’ direction of travel is the bulk scanning of everyone’s private data, all the time, without warrant or suspicion.
“That crosses a red line. Is it prudent to deploy extremely powerful surveillance technology that could easily be extended to undermine basic freedoms?” they ask.
The paper was written by security and cryptography experts Hal Abelson, Ross Anderson, Steven Bellovin, Josh Benaloh, Matt Blaze, Jon Callas, Whitfield Diffie, Susan Landau, Peter Neumann, Ronald Rivest, Jeffrey Schiller, Bruce Schneier, Vanessa Teague, and Carmela Troncoso.