The Head of WhatsApp and Other Technical Experts Push Back on Apple's Child Safety Plan

Over the weekend, as an open letter with more than 4,000 signatures circulated online, voices expressing concern and disappointment about Apple’s new child safety measures grew louder. The Apple Privacy Letter asks the iPhone maker to "reconsider its technology rollout" so that it does not undo decades of work by technical experts, academics, and policy advocates on privacy-protecting measures.

Apple's plan, announced on Thursday, involves taking hashes of images uploaded to iCloud and comparing them against a database of hashes of known CSAM images. According to Apple, this allows it to keep user data encrypted and run the analysis on the device, while still letting it report users to the authorities if they are found to be sharing child abuse imagery. Another prong of Apple’s child safety strategy lets parents opt in to being warned if their child under the age of 13 sends or views photos containing sexually explicit content. An internal Apple memo acknowledges that people will "worry about the impact of these systems."
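At its core, the first part of the plan is a set-membership check: hash each image and test the hash against a database of known values. The sketch below illustrates that idea only. Apple's actual system uses NeuralHash, a perceptual hash, together with cryptographic techniques such as private set intersection; this hypothetical example substitutes a plain SHA-256 exact match and placeholder data.

```python
import hashlib

# Hypothetical database of known image hashes (placeholder bytes stand in
# for real images). Apple's real database holds perceptual hashes of known
# CSAM supplied by child-safety organizations, not SHA-256 digests.
known_hashes = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Hash the image and check whether it appears in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes

print(matches_known_database(b"known-image-bytes"))  # True: hash is in the set
print(matches_known_database(b"unrelated-photo"))    # False: no match
```

Note that an exact cryptographic hash like SHA-256 changes completely if a single pixel changes, which is why a deployed system uses a perceptual hash that tolerates resizing and minor edits; the membership-test structure, however, is the same.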

Will Cathcart, the head of WhatsApp, said in a Twitter thread that his company would not adopt the safety measures, calling Apple's approach "very concerning." Cathcart said that WhatsApp’s system for combating child exploitation, which partly relies on user reports, preserves encryption much like Apple's and led the company to report more than 400,000 cases to the National Center for Missing and Exploited Children in 2020. (Apple is also working with the center on its CSAM detection efforts.)

Facebook, the owner of WhatsApp, has its own reasons to attack Apple over privacy. Apple’s changes to how ad tracking works in iOS 14.5 kicked off a fight between the two companies: Facebook bought newspaper ads criticizing Apple’s privacy changes as harmful to small businesses, and Apple shot back that the change "simply requires" that users be given a choice on whether to be tracked.

The list of people and organizations expressing concerns about Apple’s policies includes Edward Snowden, the Electronic Frontier Foundation, and university professors. We have collected some of these responses here as an overview of the criticisms of Apple’s new policy.

Matthew Green, an associate professor at Johns Hopkins University, pushed back on the feature before it was publicly announced. He tweeted about Apple’s plans and about how the hashing system could be abused by governments and malicious actors.

The EFF released a statement that blasted Apple’s plan, more or less calling it a “thoroughly documented, carefully thought-out, and narrowly-scoped backdoor.” The EFF’s press release goes into detail on how it believes Apple’s Child Safety measures could be abused by governments and how they decrease user privacy.
Kendra Albert, an instructor at Harvard’s Cyberlaw Clinic, has a thread on the potential dangers to queer children and Apple’s initial lack of clarity around age ranges for the parental notifications feature.
Edward Snowden retweeted the Financial Times article about the system, giving his own characterization of what Apple is doing.
Politician Brianna Wu called the system “the worst idea in Apple history.”
Writer Matt Blaze also tweeted about the concern that the technology could be abused by overreaching governments to suppress content other than CSAM.
Epic CEO Tim Sweeney also criticized Apple, saying that the company “vacuums up everybody’s data into iCloud by default.” He also promised to share more thoughts specifically about Apple’s Child Safety system.
Not every reaction has been critical, however. Ashton Kutcher (who has done advocacy work to end child sex trafficking since 2011) calls Apple’s work “a major step forward” for efforts to eliminate CSAM.