APPLE has unveiled plans to scan U.S. iPhones for images of child sexual abuse – a move that has drawn applause from child protection groups but raised concerns among some security researchers.
Those concerned claim the system could be misused – particularly by governments who may be looking to spy on their citizens.
The tool, called neuralMatch, is designed to detect known images of child sexual abuse and will scan such images before they are uploaded to iCloud.
If the system finds a match, the image will be reviewed by a human.
Once child pornography has been confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.
Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates.
The detection system will, however, only flag images that are already in the center’s database of known child pornography.
Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography.
That could fool Apple’s algorithm and alert law enforcement, Green said.
He added that researchers have been able to trick such systems pretty easily.
Other abuses could include government surveillance of dissidents or protesters.
“What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for,’” Green asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”
Tech companies including Microsoft, Google, Facebook and others have for years been sharing digital fingerprints of known child sexual abuse images.
Apple has used those to scan user files stored in its iCloud service – which is not as securely encrypted as its on-device data – for child pornography.
The company has been under government pressure for years to allow for increased surveillance of encrypted data.
Coming up with the new security measures required Apple to perform a delicate balancing act between cracking down on the exploitation of children while keeping its high-profile commitment to protecting the privacy of its users.
Apple said the latest changes will roll out this year as part of updates to its operating software for iPhones, Macs and Apple Watches.
“Apple’s expanded protection for children is a game changer,” John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement. “With so many people using Apple products, these new safety measures have lifesaving potential for children.”
Meanwhile, the Electronic Frontier Foundation, the online civil liberties pioneer, called Apple’s compromise on privacy protections a shocking about-face for users who have relied on the company’s leadership in privacy and security.