Apple pushed back against criticism that its new anti-child sexual abuse detection system could be used for “backdoor” surveillance. The company insisted it won’t “accede to any government’s request to expand” the system’s scope.
The new plan, announced last week, includes a feature that identifies and blurs sexually explicit images received by children using Apple’s ‘Messages’ app – and another feature that notifies the company if it detects any Child Sexual Abuse Material (CSAM) uploaded to iCloud.
The announcement sparked instant backlash from digital privacy groups, who said it “introduces a backdoor” into the company’s software that “threatens to undermine fundamental privacy protections” for users, under the guise of child protection.
In an open letter posted on GitHub and signed by security experts, including former NSA whistleblower Edward Snowden, the groups condemned the “privacy-invasive content scanning technology” and warned that the features have the “potential to bypass any end-to-end encryption.”
Apple has published an FAQ regarding their CSAM updates. It may be time to revisit the question of how much does metadata reveal about data, and whether this explanation is being too cute about the letter/spirit of E2EE. https://t.co/gv0uP17S6w pic.twitter.com/6GgLvynazb
— Jeffrey Vagle (@jvagle) August 9, 2021
After an internal memo reportedly referred to the criticism as the “screeching voices of the minority,” Apple on Monday released an FAQ about its ‘Expanded Protections for Children’ system, saying it was designed to apply only to images uploaded to iCloud and not the “private iPhone photo library.” Nor will it affect users who have iCloud Photos disabled.
The system, it adds, only works with CSAM image hashes provided by the National Center for Missing and Exploited Children (NCMEC), and “there is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC.”
‘Image hashes’ refers to the use of algorithms to assign a unique ‘hash value’ to an image – which has been likened to a ‘digital fingerprint’ – making it easier for platforms to identify and remove content deemed harmful.
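The matching idea can be sketched in a few lines of code. Note this is a simplified illustration: it uses an exact cryptographic hash (SHA-256), whereas Apple’s system reportedly uses a perceptual hash (NeuralHash) that tolerates resizing and re-encoding; the byte strings and hash database below are hypothetical.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Return a hex 'fingerprint' of an image's raw bytes.

    Illustrative only: SHA-256 matches exact bytes, while a perceptual
    hash would also match visually similar copies of an image.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# A database of known hash values (hypothetical entries for illustration,
# standing in for the hash set supplied by child safety organizations).
known_hashes = {image_hash(b"known-flagged-image-bytes")}

def is_flagged(image_bytes: bytes) -> bool:
    # An image is flagged only if its fingerprint appears in the known set;
    # the content itself is never compared directly.
    return image_hash(image_bytes) in known_hashes
```

Because only fingerprints are compared, the platform can check uploads against the database without interpreting image content – which is also why critics worry about what else could be added to that database.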
While Apple insists it screens for image hashes “validated to be CSAM” by child safety organizations, digital rights watchdog the Electronic Frontier Foundation (EFF) had previously warned that this would lead to “mission creep” and “overreach.”
“One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of ‘terrorist’ content that companies can contribute to and access for the purpose of banning such content,” the non-profit warned last week, referring to the Global Internet Forum to Counter Terrorism (GIFCT).
Apple countered that, because it “does not add to the set of known CSAM image hashes,” and because the “same set of hashes” is stored in the operating system of every iPhone and iPad, it is “not possible” to use the system to target users by “injecting” non-CSAM images into it.
“Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it,” the company vows in its FAQ.
“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” it added.
However, the company has already been criticized for using “misleading phrasing” to avoid explaining the potential for “false positives” in the system – the “likelihood” of which Apple claims is “less than one in one trillion [incorrectly flagged accounts] per year”.
Here’s an example of what I mean about misleading phrasing. Apple says this system reports CSAM (true!) and it doesn’t report on photos that are exclusively on-device and aren’t synced to iCloud (also true!). But what about false positives for photos that *are* synced to iCloud? pic.twitter.com/E7zg5kHLnj
— Jonathan Mayer (@jonathanmayer) August 9, 2021