A few days ago, Apple introduced two distinct security features meant to fight child sexual abuse imagery. One is a privacy-preserving way to scan for Child Sexual Abuse Material (CSAM) in photos stored in iCloud. The other is an iMessage tool that detects nudity in messages sent to children and can notify parents when it does. The two are not related, and Apple isn't scanning all the photos on your phone in search of porn. Apple explained as much in the days that followed the announcement, hoping to address unsurprising privacy concerns. Some worried that Apple was opening Pandora's box by effectively building a nascent backdoor into the iPhone.
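To see why Apple frames this as something narrower than general content scanning, it helps to look at the shape of the check. The sketch below is hypothetical and is not Apple's code: Apple's announced system computes a perceptual hash called NeuralHash on-device and compares it against a blinded database of known CSAM hashes, with a threshold of matches required before any human review. The helpers here (loadKnownHashDatabase, perceptualHash) are stand-ins invented for illustration.

```swift
import Foundation
import CryptoKit

// Minimal, hypothetical sketch of "known-hash" matching -- NOT Apple's
// implementation. Apple's real system uses a perceptual NeuralHash and a
// blinded cryptographic matching protocol; everything below is a stand-in.

// Hypothetical loader for a database of hashes of *known* abuse images.
// In Apple's design, the database ships as an unreadable blob inside iOS.
func loadKnownHashDatabase() -> Set<String> {
    []  // placeholder
}

// Hypothetical stand-in for a perceptual hash. A real perceptual hash
// tolerates resizing and recompression; SHA-256 is used here only to
// keep the sketch self-contained and runnable.
func perceptualHash(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

let knownHashes = loadKnownHashDatabase()

// The key point: the check is set membership against fixed, known images.
// Nothing here classifies arbitrary photo content as pornography.
func matchesKnownCSAM(_ imageData: Data) -> Bool {
    knownHashes.contains(perceptualHash(of: imageData))
}
```

Because the database contains only hashes of previously identified material, a photo that isn't already in it can never match, which is the core of Apple's argument that the feature is not a general-purpose scanner.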
The worry was that some governments might pressure Apple into customizing the photo scanning tool to look for other kinds of material on iPhones. In a new interview, Craig Federighi, Apple's SVP of Software Engineering, tried to dispel the notion that the tool is a backdoor.