WhatsApp will not adopt Apple’s new child safety measures intended to stop the spread of child abuse imagery, says WhatsApp head Will Cathcart. In a Twitter thread, he explains that he believes Apple “has built software that can scan all the private photos on your phone,” and says Apple has taken the wrong path in trying to improve its response to child sexual abuse material, or CSAM.
Apple’s plan, announced on Thursday, is to take hashes of images uploaded to iCloud and compare them to a database of hashes of known CSAM images. According to Apple, this allows it to keep user data encrypted and run the scan on-device, while still letting it report users to authorities if they are found to be sharing child abuse imagery. Another part of Apple’s child safety strategy is to optionally warn parents if their child under 13 sends or views photos containing sexually explicit content. An internal Apple memo acknowledged that people would be “worried about the implications” of the systems.
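To make the mechanism concrete, here is a minimal, purely illustrative sketch of hash matching against a database with a reporting threshold. It is not Apple’s implementation: Apple describes a perceptual hash (NeuralHash) plus cryptographic protocols such as private set intersection, while this toy uses SHA-256 (which only matches byte-identical files), a made-up hash list, and an assumed threshold value.

```python
import hashlib
from pathlib import Path

# Illustrative placeholder entries, not real hashes. Apple's real system uses
# a perceptual hash ("NeuralHash") that tolerates resizing and re-encoding;
# SHA-256 here only matches byte-identical files.
KNOWN_HASHES = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
}

# Apple described a match threshold before any report could be made; the
# exact number was not in the initial announcement, so 30 is an assumption.
REPORT_THRESHOLD = 30


def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash function."""
    return hashlib.sha256(data).hexdigest()


def scan_library(photo_dir: str) -> bool:
    """Count matches against the known-hash set and flag the account only
    if the count crosses the threshold, mirroring Apple's described design."""
    matches = sum(
        1
        for photo in sorted(Path(photo_dir).glob("*.jpg"))
        if image_hash(photo.read_bytes()) in KNOWN_HASHES
    )
    return matches >= REPORT_THRESHOLD
```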
I read the information published by Apple yesterday and I am worried. I think this is the wrong approach and a setback for the privacy of people all over the world.
People asked if we would adopt this system for WhatsApp. The answer is no.
– Will Cathcart (@wcathcart) August 6, 2021
Cathcart calls Apple’s approach “very concerning,” arguing that it would allow governments with differing ideas about which kinds of images are and are not acceptable to ask Apple to add non-CSAM images to the databases it compares against. Cathcart says WhatsApp’s system to combat child exploitation, which partly relies on user reports, preserves encryption, like Apple’s, and led the company to report more than 400,000 cases to the National Center for Missing and Exploited Children in 2020. (Apple is also working with the center on its CSAM detection efforts.)
WhatsApp’s owner, Facebook, has reason to push back on Apple over privacy concerns. Apple’s changes to how ad tracking works in iOS 14.5 sparked a fight between the two companies, with Facebook buying newspaper ads criticizing Apple’s privacy changes as harmful to small businesses. Apple fired back, saying the change “simply requires” that users be given a choice over whether to be tracked.
It’s not just WhatsApp that has criticized Apple’s new child safety measures. The list of people and organizations raising concerns includes Edward Snowden, the Electronic Frontier Foundation, university professors, and more. We’ve rounded up some of those reactions here as an overview of the criticisms leveled against Apple’s new policy.
Matthew Green, an associate professor at Johns Hopkins University, pushed back on the feature before it was publicly announced. He tweeted about Apple’s plans and how the hashing system could be abused by governments and malicious actors.
These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple’s servers if too many of them appear.
– Matthew Green (@matthew_d_green) August 5, 2021
The EFF issued a statement lambasting Apple’s plan, calling it more or less a “thoroughly documented, carefully thought-out, and narrowly-scoped backdoor.” The EFF’s press release details how it believes Apple’s child safety measures could be abused by governments and how they diminish user privacy.
Apple’s filtering of iMessage and iCloud is not a slippery slope to backdoors that suppress speech and make our communications less secure. We’re already there: this is a fully built system just waiting for external pressure to make the slightest change. https://t.co/f2nv062t2n
– EFF (@EFF) August 5, 2021
Kendra Albert, an instructor at Harvard’s Cyberlaw Clinic, posted a thread on the potential dangers to queer children and Apple’s initial lack of clarity around the age ranges for the parental notification feature.
The idea that parents are safe people for teens to have conversations about sex or sexting with is admirable, but in many cases, it is not true. (And as far as I can tell, this stuff doesn’t just apply to kids under 13.)
– Kendra Albert (@KendraSerra) August 5, 2021
The EFF reports that iMessage nudity notifications won’t be sent to parents if the child is between 13 and 17, but it’s nowhere in the Apple docs I can find. https://t.co/Ma1BdyqZfW
– Kendra Albert (@KendraSerra) August 6, 2021
Edward Snowden retweeted the Financial Times article on the system, offering his own characterization of what Apple is doing.
Apple plans to modify iPhones to continuously search for contraband:
“This is an absolutely appalling idea, because it will lead to distributed mass surveillance of our phones and laptops,” said Ross Anderson, professor of security engineering. https://t.co/rS92HR3pUZ
– Edward Snowden (@Snowden) August 5, 2021
Politician Brianna Wu called the system “the worst idea in Apple’s history.”
It’s the worst idea in Apple history, and I don’t say it lightly.
It destroys their credibility on privacy. It will be abused by governments. It will get gay children killed and disowned. This is the worst idea ever. https://t.co/M2EIn2jUK2
– Brianna Wu (@BriannaWu) August 5, 2021
Just to clarify: Apple’s scanning does not detect photos of child abuse. It detects a list of known banned images added to a database, which are initially child abuse images found circulating elsewhere. What images are added over time is arbitrary. It doesn’t know what a child is.
– SwiftOnSecurity (@SwiftOnSecurity) August 5, 2021
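That content-agnostic design is the crux of the concern raised by Cathcart, Green, and the EFF. In the illustrative matcher sketched earlier, nothing in the code knows or cares what an image depicts, so expanding what gets flagged is a one-line change for whoever controls the hash list. Again, this is a hypothetical sketch, not Apple’s system:

```python
# Continuing the illustrative sketch above: the matcher is content-agnostic.
# Whoever controls the database controls what is flagged; targeting any new
# image, CSAM or not, is a single insertion.
dissident_image = Path("banned_political_image.jpg").read_bytes()  # hypothetical file
KNOWN_HASHES.add(image_hash(dissident_image))
# From here on, scan_library() flags that image exactly as it would CSAM.
```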
Security researcher Matt Blaze also tweeted about concerns that the technology could be abused by overreaching governments trying to scan for content beyond CSAM.
In other words, not only must the policy be exceptionally robust, but the implementation as well.
– matt blaze (@mattblaze) August 6, 2021
Epic CEO Tim Sweeney also slammed Apple, saying the company “sucks everyone’s data into iCloud by default.” He also promised to share more thoughts on Apple’s child safety system.
It’s excruciating how Apple sucks everyone’s data into iCloud by default, hides the 15+ separate options to disable parts of it in settings under your name, and forces you to have a junk email account. Apple would NEVER allow a third party to ship an app like this.
– Tim Sweeney (@TimSweeneyEpic) August 6, 2021
I’ll share some very detailed thoughts on this related topic later.
– Tim Sweeney (@TimSweeneyEpic) August 6, 2021
However, not all reactions have been critical. Ashton Kutcher (who has done advocacy work to end child sex trafficking since 2011) calls Apple’s work “a big step forward” for efforts to end CSAM.
I believe in privacy, including for children whose sexual abuse is documented and disseminated online without consent. These efforts announced by @Apple are a big step forward in the fight to eliminate CSAM from the Internet. https://t.co/TQIxHlu4EX
– Ashton Kutcher (@aplusk) August 5, 2021