Shay Diddy

Apple to scan US iPhones for images of child sexual abuse

Apple will begin reporting images of child exploitation uploaded to iCloud in the U.S. to law enforcement, the company said in a statement on Thursday.

The new system will detect images known as child sexual abuse material (CSAM) using a process called hashing, in which an image is transformed into a unique number that corresponds to that image.
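As a rough illustration of that idea only (the article describes the technique simply as "hashing", and Apple's actual algorithm is a proprietary perceptual hash, not the cryptographic hash used here), the sketch below turns a file into a fixed-size fingerprint and checks it against a hypothetical set of known fingerprints:

```python
# Illustrative sketch only: SHA-256 stands in for Apple's proprietary hash
# to show the idea of a fixed-size fingerprint compared against known values.
import hashlib

def image_fingerprint(path: str) -> str:
    """Return a hex fingerprint of the raw bytes of the file at `path`."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hypothetical set of fingerprints of known, already-reported images.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def matches_known_image(path: str) -> bool:
    """True if the file's fingerprint appears in the known-image set."""
    return image_fingerprint(path) in KNOWN_HASHES
```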

Apple started testing the system on Thursday, but most U.S. iPhone users won't be part of it until an iOS 15 update later this year.

The move brings Apple in line with other cloud services which already scan user files for content that violates their terms of service, including child exploitation images.

Apple has long defended device encryption, and it operates in countries with fewer speech protections than the U.S. Law enforcement officials around the world have pressured the company to weaken encryption in iMessage and in software services like iCloud in order to investigate child exploitation or terrorism.

Thursday's announcement is a way for Apple to address some of those demands without giving up some of its engineering principles around user privacy.

Before an image is stored in iCloud, Apple matches the image's hash against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC). That database will be distributed in the code of iOS beginning with an update to iOS 15, and the matching is done on the user's iPhone, not in the cloud, Apple said.

If Apple then detects a certain number of violating files in an iCloud account, the system will upload a file that allows Apple to decrypt and see the images on that account, and a human will manually review them to confirm the matches. Apple will only review images that match content already known and reported to the NCMEC database; it won't be able to detect, for example, parents' photos of their kids in the bath, because those images wouldn't be part of that database.
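A minimal sketch of that flow, under stated assumptions: the function names and the threshold value below are hypothetical (Apple has not published the exact number in this article), and the real protocol relies on cryptographic techniques rather than a plain counter kept in the clear.

```python
# Conceptual sketch of on-device matching with a review threshold.
# All names and the threshold value are placeholders, not Apple's protocol.
from typing import Iterable, Set

MATCH_THRESHOLD = 30  # placeholder; the article does not give the real number

def count_matches(upload_hashes: Iterable[str], known_hashes: Set[str]) -> int:
    """Count uploaded-image fingerprints that appear in the known-CSAM set."""
    return sum(1 for h in upload_hashes if h in known_hashes)

def eligible_for_human_review(upload_hashes: Iterable[str],
                              known_hashes: Set[str]) -> bool:
    """Only once enough matches accumulate would Apple be able to decrypt
    the matching images and have a person confirm them."""
    return count_matches(upload_hashes, known_hashes) >= MATCH_THRESHOLD
```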

If the person doing the manual review concludes the system did not make an error, Apple will disable the user's iCloud account and send a report to NCMEC, or notify law enforcement if necessary. The system only works on images uploaded to iCloud, and users can turn those uploads off; photos or other images on a device that haven't been uploaded to Apple's servers won't be part of the system.
