Apple will scan photos stored on iPhones and iCloud for child abuse imagery
The feature will roll out in the US first
By Jay Peters (@jaypeters) | Aug 5, 2021, 12:55pm EDT


[Image: The iPhone 12, in blue. Photo by Vjeran Pavic / The Verge]
Update August 5th, 3:21PM ET: Apple has announced more about what the Financial Times reported and revealed new tools coming to iMessage that warn children about sexually explicit photos. The new features will be coming later this year as updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. You can read more about them on Apple’s website. Our original article follows.
Apple plans to scan photos stored on iPhones and iCloud for child abuse imagery, according to the Financial Times. The new system could help law enforcement in criminal investigations but may open the door to increased legal and government demands for user data.
The system, called neuralMatch, will “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified,” the Financial Times said. neuralMatch, which was trained using 200,000 images from the National Center for Missing & Exploited Children, will roll out first in the US. Photos will be hashed and compared with a database of known images of child sexual abuse.
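In outline, the matching step is a lookup of an image hash against a database of known hashes. Apple has not published the neuralMatch perceptual-hash algorithm, so the sketch below is only illustrative: it substitutes a plain SHA-256 digest (which matches only byte-identical files, where a real perceptual hash would also match resized or recompressed copies) and a hypothetical in-memory set standing in for the NCMEC-derived database.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for the unpublished neuralMatch perceptual hash.
    # SHA-256 catches only exact duplicates; a perceptual hash
    # would tolerate resizing, cropping, and recompression.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical hash database; in the reported system this would be
# derived from known child-abuse imagery supplied by NCMEC.
known_hashes = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_suspect(image_bytes: bytes) -> bool:
    """True if this photo's hash appears in the known-image database."""
    return image_hash(image_bytes) in known_hashes
```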
“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not,” the Financial Times said. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
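The reported flow amounts to a threshold gate: every upload carries a voucher, and nothing is surfaced for human review until enough vouchers on an account are flagged. Here is a minimal sketch of that control flow only; the `SafetyVoucher` type and the threshold value are invented for illustration, and Apple has disclosed neither the real threshold nor the cryptography behind the vouchers and the conditional decryption.

```python
from dataclasses import dataclass

@dataclass
class SafetyVoucher:
    photo_id: str
    suspect: bool  # result of the on-device hash match

THRESHOLD = 10  # placeholder; the real number is undisclosed

def review_queue(vouchers: list[SafetyVoucher]) -> list[str]:
    """Return photo IDs eligible for human review.

    Below the threshold nothing is surfaced, matching the reported
    design in which suspect photos are decrypted only after a
    certain number of matches accumulate.
    """
    suspect_ids = [v.photo_id for v in vouchers if v.suspect]
    return suspect_ids if len(suspect_ids) >= THRESHOLD else []
```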
Johns Hopkins University professor and cryptographer Matthew Green raised concerns about the system on Twitter Wednesday night. “This sort of tool can be a boon for finding child pornography in people’s phones,” Green said. “But imagine what it could do in the hands of an authoritarian government?”
“Even if you believe Apple won’t allow these tools to be misused [crossed fingers emoji] there’s still a lot to be concerned about,” he added. “These systems rely on a database of ‘problematic media hashes’ that you, as a consumer, can’t review.”
Apple already checks iCloud files against known child abuse imagery, like every other major cloud provider. But the system described here would go further, allowing central access to local storage. It would also be trivial to extend the system to crimes other than child abuse — a particular concern given Apple’s extensive business in China.
Apple informed some US academics about the system this week and may share more about it “as soon as this week,” according to two security researchers who were briefed on an earlier Apple meeting, the Financial Times reports.
Apple has previously touted the privacy protections built into its devices, and famously stood up to the FBI when the agency wanted Apple to build a backdoor into iOS to access an iPhone used by one of the shooters in the 2015 attack in San Bernardino. The company did not respond to a request for comment on the Financial Times report.


Source: Apple will scan photos stored on iPhones and iCloud for child abuse imagery - The Verge
 