How to Prevent Apple From Scanning Your iPhone’s Photos in iOS 15
Starting with iOS 15 and iPadOS 15, Apple is applying a new child-protection policy to photos you upload to iCloud. The policy lets Apple report known child sexual abuse imagery to the authorities, and on the surface that sounds like a clear good. But there has been a lot of controversy and confusion about how the scanning actually works, so let’s walk through the mechanism and then what you can do if you don’t want Apple scanning photos from your iPhone.
How Apple’s iPhone Scanning Feature Works
Part of the confusion stems from the fact that Apple announced two child safety features together, but they work in very different ways.
The first is the scanning feature for iCloud Photos. Here, Apple compares digital fingerprints of your photographs against a database of known CSAM (Child Sexual Abuse Material). That database is maintained by the National Center for Missing & Exploited Children (NCMEC), a quasi-governmental organization in the United States.
The second feature is an opt-in, machine-learning-based parental control limited to the Messages app on iPhone and iPad. It warns children, and optionally their parents, about sexually explicit images sent or received in Messages.
The controversy revolves around the first feature, iCloud Photo scanning, which is enabled for everyone who uses iCloud Photos. When your iPhone uploads a photo to iCloud Photos (assuming you have iCloud Photos turned on), a multi-part algorithm performs part of the analysis on your device before the photo is sent to iCloud; the server then performs the rest. If your account reaches the threshold of 30 matches against known CSAM images, Apple flags your account.
A manual review process then begins, in which Apple can see only the flagged images, not the rest of your library. If reviewers confirm the matches, Apple reports the account to NCMEC, which takes things from there.
Apple says the system matches only against the database of known CSAM and will not flag ordinary pornography, nude photos, or pictures of your baby in the bathtub, for example. Craig Federighi went into technical detail on these safeguards in a recent interview with The Wall Street Journal; if you’re interested, watch the video below.
According to Apple, no one actually looks at your photos at this stage. Instead, your device computes a “NeuralHash” (a string of numbers derived from the image) for each photo and compares it against the hashes from the CSAM database. The result of that comparison is stored alongside the uploaded image in what Apple calls a “safety voucher.”
Further analysis then happens on the server side: only if 30 safety vouchers contain matches against CSAM images does the system flag your account, at which point human reviewers check whether the images are actually illegal and report the images and the account.
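To make the voucher-and-threshold idea concrete, here is a deliberately simplified sketch in Python. It is not Apple’s implementation: the real NeuralHash is a perceptual hash robust to resizing and recompression (SHA-256 below is just a stand-in), and the real protocol uses private set intersection and threshold secret sharing so that neither the device nor the server learns individual match results before the threshold is reached. The function names, the toy hash database, and the voucher structure are all assumptions for illustration.

```python
import hashlib

# Stand-in for NeuralHash. A real perceptual hash survives resizing and
# recompression; a cryptographic hash like SHA-256 does not, and is used
# here only to keep the sketch self-contained.
def neural_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known-bad image hashes (NCMEC maintains the
# real one; these are made-up placeholders).
KNOWN_CSAM_HASHES = {neural_hash(b"known-bad-image-%d" % i) for i in range(5)}

MATCH_THRESHOLD = 30  # Apple's stated review threshold

def make_safety_voucher(image_bytes: bytes) -> dict:
    """On-device step: record whether this photo's hash matches the database."""
    h = neural_hash(image_bytes)
    return {"hash": h, "matches": h in KNOWN_CSAM_HASHES}

def account_flagged(vouchers: list) -> bool:
    """Server-side step: flag the account only once matches hit the threshold."""
    return sum(v["matches"] for v in vouchers) >= MATCH_THRESHOLD
```

The key property the sketch illustrates is that a single match does nothing; only an accumulation of 30 matching vouchers triggers the human-review step.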
How to Prevent Apple From Scanning Photos on Your iPhone
Now that you know how the system works, you can decide whether to opt out. The scan only happens when photos are uploaded to iCloud Photos.
Photos sent through messaging apps like WhatsApp or Telegram are not scanned by Apple. That said, if you don’t want Apple to perform this scan at all, your only option is to turn off iCloud Photos. To do this, open the Settings app on your iPhone or iPad, go to “Photos,” and turn off the “iCloud Photos” toggle. In the pop-up that appears, choose the “Download Photos & Videos” option to keep a local copy of everything in your iCloud Photo Library.
You can also use the iCloud website to download all of the photos to your computer. Your iPhone will stop uploading new photos to iCloud and Apple won’t scan your photos.
Looking for an alternative? There really isn’t one. All major cloud photo services run similar scanning; they just do it entirely server-side (whereas Apple uses a combination of on-device and cloud matching). If you want to avoid this kind of scanning altogether, use local backups, a NAS, or a backup service that is fully end-to-end encrypted.