WhatsApp chief and other tech experts strike back at Apple’s child safety plan

WhatsApp will not adopt Apple’s new child safety measures intended to stop the spread of child abuse imagery, says WhatsApp head Will Cathcart. In a Twitter thread, he explains that he believes Apple “has created software that can scan all of the private photos on your phone,” and says Apple has taken the wrong path in trying to improve its response to child sexual abuse material, or CSAM.

Apple’s plan, which it announced on Thursday, is to take hashes of images uploaded to iCloud and compare them to a database of hashes of known CSAM imagery. According to Apple, this allows it to keep user data encrypted and run the scan on the device, while still allowing it to report users to the authorities if they are found to be sharing child abuse imagery. Another prong of Apple’s child safety strategy is to optionally warn parents if their child under 13 years old sends or views photos containing sexually explicit content. An internal Apple memo acknowledged that people would be “worried about the implications” of the systems.
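In simplified terms, the matching step Apple describes is a hash lookup: derive a fingerprint of each image and check it against a set of known fingerprints. The sketch below is purely illustrative, not Apple’s implementation — Apple uses a perceptual hash (NeuralHash) that can match visually similar images, plus cryptographic protections such as threshold secret sharing, whereas this example substitutes an ordinary SHA-256 digest and a plain Python set for clarity.

```python
import hashlib

# Hypothetical database of known image hashes (illustrative value only:
# this is the SHA-256 digest of the empty byte string).
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_hash(image_bytes: bytes) -> str:
    """Return a hex digest of the image's raw bytes.

    Note: a cryptographic hash like SHA-256 only matches byte-identical
    files; Apple's NeuralHash is a perceptual hash designed to survive
    resizing and re-encoding. SHA-256 is a stand-in here.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """Check an image's hash against the known-hash database."""
    return image_hash(image_bytes) in KNOWN_HASHES

print(matches_known_database(b""))   # True: empty-bytes digest is in the set
print(matches_known_database(b"x"))  # False: no match in the database
```

The lookup itself is trivial; the controversy centers on who controls the contents of the hash database, which is exactly the concern Cathcart and others raise below.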

Cathcart calls Apple’s approach “very concerning,” arguing that it would allow governments with different ideas about what types of images are and are not acceptable to demand that Apple add non-CSAM images to the databases it compares against. Cathcart says WhatsApp’s system for combating child exploitation, which relies in part on user reports, preserves encryption like Apple’s and led the company to report more than 400,000 cases to the National Center for Missing and Exploited Children in 2020. (Apple is also working with the Center on its CSAM detection efforts.)

WhatsApp owner Facebook has its own reasons to push back on Apple over privacy concerns. Apple’s changes to how ad tracking works in iOS 14.5 sparked a fight between the two companies, with Facebook buying newspaper ads criticizing Apple’s privacy changes as harmful to small businesses. Apple fired back, saying the change “simply requires” that users be given a choice of whether or not to be tracked.

It’s not just WhatsApp that has criticized Apple’s new child safety measures. The list of people and organizations raising concerns includes Edward Snowden, the Electronic Frontier Foundation, university professors, and more. We’ve rounded up some of those reactions here to give an overview of the criticisms leveled against Apple’s new policy.

Matthew Green, an associate professor at Johns Hopkins University, pushed back on the feature before it was publicly announced. He tweeted about Apple’s plans and how the hashing system could be abused by governments and malicious actors.

The EFF released a statement lambasting Apple’s plan, calling it more or less a “carefully documented, carefully thought out, narrow-scope backdoor.” The EFF’s press release goes into detail on how it believes Apple’s child safety measures could be abused by governments and how they reduce user privacy.

Kendra Albert, an instructor at Harvard’s Cyberlaw Clinic, tweeted a thread on the potential dangers to queer children and Apple’s initial lack of clarity around the age ranges for the parental notification feature.

Edward Snowden retweeted the Financial Times article on the system, offering his own characterization of what Apple is doing.

Politician Brianna Wu called the system “the worst idea in Apple’s history.”

Writer Matt Blaze also tweeted about concerns that the technology could be abused by overreaching governments trying to suppress content other than CSAM.

Epic CEO Tim Sweeney also slammed Apple, saying the company “vacuums up everybody’s data into iCloud by default.” He also promised to share more thoughts on Apple’s child safety system later.

However, not all reactions have been critical. Ashton Kutcher (who has done advocacy work to end child sex trafficking since 2011) calls Apple’s work “a big step forward” for efforts to end CSAM.
