Aspen · SENIOR MEMBER · Joined Sep 18, 2019 · Messages: 3,585 · Reaction score: 1
US iPhone users’ photos will be scanned by Apple’s automated “neuralMatch” system for pictures of child porn and abuse, according to reports. Security researchers are alarmed that the scheme threatens privacy and encryption.
The Financial Times reported on the plan on Thursday, citing anonymous sources briefed on it. The scheme was reportedly shared with some US academics earlier in the week in a virtual meeting.
Dubbed “neuralMatch,” the system will reportedly scan every photo uploaded to iCloud in the US and tag it with a “safety voucher.” Once a certain – unspecified – number of photos are labeled as suspect, Apple will decrypt them and alert human reviewers, who can then contact the relevant authorities if the imagery is verified as illegal, the FT report said. The program is initially intended to roll out in the US only.
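The FT report doesn’t describe the matching mechanics, but systems of this kind are typically built on perceptual hashes checked against a database of known illegal imagery, with a per-account counter gating escalation. Here is a minimal sketch of that general shape – the names, the hash function, and the threshold value are all assumptions for illustration, not Apple’s actual design:

```python
from dataclasses import dataclass, field

THRESHOLD = 10  # assumed escalation threshold; the report says it's unspecified

def perceptual_hash(image_bytes: bytes) -> int:
    """Stand-in for a real perceptual hash; a placeholder, not neuralMatch."""
    return hash(image_bytes) & 0xFFFFFFFFFFFFFFFF

@dataclass
class SafetyVoucher:
    photo_id: str
    matched: bool  # whether the photo's hash hit the known-bad database

@dataclass
class Account:
    vouchers: list = field(default_factory=list)

def scan_upload(account, photo_id, image_bytes, known_bad_hashes):
    """Attach a 'safety voucher' to each upload; escalate past the threshold."""
    matched = perceptual_hash(image_bytes) in known_bad_hashes
    account.vouchers.append(SafetyVoucher(photo_id, matched))
    if sum(v.matched for v in account.vouchers) >= THRESHOLD:
        # Per the FT report, only at this point would the flagged photos
        # be decrypted and handed to human reviewers.
        escalate_to_human_review(account)

def escalate_to_human_review(account):
    flagged = [v.photo_id for v in account.vouchers if v.matched]
    print(f"Escalating {len(flagged)} flagged photos for human review")
```

Note that nothing in this flow lets the user inspect the known-bad hash list – which is exactly the point the criticism below turns on.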
The plan was described as a compromise between Apple’s promise to protect customer privacy and demands from the US government, intelligence and law enforcement agencies, and child safety activists to help them battle terrorism and child pornography.
Researchers who found out about the plan were alarmed, however. Matthew Green, a security professor at Johns Hopkins University, was the first to tweet about the issue in a lengthy thread late on Wednesday.
The problem with this approach, Green warned, is that whoever controls the list of prohibited imagery “can search for whatever content they want on your phone, and you don’t really have any way to know what’s on that list because it’s invisible to you.”
Depending on how the system works, “it might be possible for someone to make problematic images that ‘match’ entirely harmless images. Like political images shared by persecuted groups,” he added. While he could see internet trolls doing it as a prank, Green added that “there are some really bad people in the world who would do it on purpose.”
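That risk follows from how perceptual hashes work: they are deliberately tolerant of small changes, so visually unrelated images can be engineered to produce the same hash. A toy illustration using a 9-pixel average-hash – the hash and match distance here are illustrative only, not what any real system uses:

```python
def average_hash(pixels):
    """Toy average-hash: bit i is set if pixel i is brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

benign  = [10, 200, 30, 220, 15, 210, 25, 205, 20]  # a harmless image
crafted = [0, 255, 0, 255, 0, 255, 0, 255, 0]       # looks nothing like it, but
                                                    # has the same bright/dark pattern

MATCH_DISTANCE = 2  # illustrative tolerance
print(hamming(average_hash(benign), average_hash(crafted)))  # 0 -> a "match"
```

Real perceptual hashes are far harder to collide by hand, but the structural weakness – matching on appearance rather than exact bytes – is what Green is pointing at.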
“I don’t particularly want to be on the side of child porn and I’m not a terrorist. But the problem is that encryption is a powerful tool that provides privacy, and you can’t really have strong privacy while also surveilling every image anyone sends,” he tweeted.
Several other researchers echoed Green’s concerns. Apple’s move was “tectonic” and a “huge and regressive step for individual privacy,” Alec Muffett, a security researcher and privacy campaigner who has worked at Facebook and Deliveroo, told the FT.
“Apple are walking back privacy to enable 1984,” he added.
Ross Anderson, professor of security engineering at the University of Cambridge, called it “an absolutely appalling idea” that will lead to “distributed bulk surveillance” of people’s phones and laptops.
Word of Apple’s snooping plan comes just weeks after the revelation that iPhones around the world – though reportedly not in the US – were targeted by Pegasus, spyware made by the Israeli firm NSO Group, with a leaked list of more than 50,000 potential surveillance targets that included journalists, dissidents and even heads of state.
Source: “Apple to scan photos on all US iPhones for ‘child abuse imagery’ as researchers warn of impending ‘1984’” – www.rt.com