According to a report in the Financial Times, Apple plans to scan photos stored on iPhones and in iCloud for images of child abuse. The new system could aid law enforcement in criminal investigations, but it may also invite increased legal and government demands for user data.
The system, called neuralMatch, will “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified,” the Financial Times said. neuralMatch, which was trained using 200,000 images from the National Center for Missing & Exploited Children, will roll out first in the US. Photos will be hashed and compared with a database of known images of child sexual abuse.
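The hash-and-compare step described above can be sketched in a few lines. This is an illustrative stand-in, not Apple's implementation: neuralMatch reportedly uses perceptual hashing, which also matches resized or re-encoded copies of an image, whereas the sketch below uses an exact cryptographic hash purely to keep the idea simple. All names and values here are hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known images (stand-in values only).
# A real system would hold perceptual hashes supplied by an organization
# such as the National Center for Missing & Exploited Children.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def is_known_image(photo_bytes: bytes) -> bool:
    """Hash a photo and check it against the database of known hashes."""
    digest = hashlib.sha256(photo_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

Note that with an exact hash like this, changing a single byte of the photo produces a completely different digest; that is precisely why systems of this kind use perceptual hashes instead.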
“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not,” the Financial Times said. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
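The threshold mechanism the report describes, where nothing happens until a certain number of photos on an account are flagged, can be sketched as follows. The threshold value and all names here are hypothetical; Apple has not published how many matches are required.

```python
from dataclasses import dataclass

MATCH_THRESHOLD = 3  # hypothetical; the real threshold is not public

@dataclass
class Account:
    suspect_vouchers: int = 0

    def record_voucher(self, suspect: bool) -> bool:
        """Record a safety voucher for an uploaded photo.

        Returns True once the account has accumulated enough suspect
        vouchers to cross the threshold that would, per the report,
        allow the flagged photos to be decrypted for human review.
        """
        if suspect:
            self.suspect_vouchers += 1
        return self.suspect_vouchers >= MATCH_THRESHOLD
```

The point of the threshold design is that a single false-positive match on its own reveals nothing about an account.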
Johns Hopkins University professor and cryptographer Matthew Green raised concerns about the system on Twitter Wednesday night. “This sort of tool can be a boon for finding child pornography in people’s phones,” Green said. “But imagine what it could do in the hands of an authoritarian government?”
“Even if you believe Apple won’t allow these tools to be misused [crossed fingers emoji] there’s still a lot to be concerned about,” he added. “These systems rely on a database of ‘problematic media hashes’ that you, as a consumer, can’t review.”
According to the Financial Times, two security researchers who were briefed on Apple’s earlier meeting said the company informed some American academics of the plan this week, and that Apple may share more information about the system “as soon as this week.”
Apple has previously touted the privacy protections built into its devices, and it objected when the FBI asked it to build a backdoor into iOS to access the iPhone used by a gunman in the 2015 San Bernardino attack. The company did not respond to a request for comment on the Financial Times report.