Apple says new program to scan iPhones for child abuse material will not be used for political purposes

AP Photo/Damian Dovarganes

A week after Apple announced that it is rolling out a system to scan iCloud photos for child sexual abuse materials (CSAM), the tech giant says that it will refuse any requests for its system to be used for political ends. 

The automated system will allow iPhones to scan for child abuse imagery, potentially opening the door to surveillance of millions of people’s personal devices. 

The system Apple is rolling out has been met with a barrage of criticism from privacy rights activists and whistleblowers like Edward Snowden, who warned that the system is a form of “mass surveillance.” 

As reported by Rebel News, Snowden wrote: “No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs—*without asking.*”

On Sunday, Apple released a statement to assuage concerns that governments could use the system to perform intrusive surveillance on the public by adding non-CSAM images to the database. The statement reads:

“Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.

“Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.”

Apple previously rejected the Federal Bureau of Investigation’s demand that it help decrypt a locked iPhone belonging to the 2015 San Bernardino mass shooters, Syed Rizwan Farook and Tashfeen Malik, after the NSA was unable to do so. The FBI had called on Apple to create a new version of the iOS operating system that could be installed and run on the phone to disable its security features and grant the authorities access. 

As reported by Wired, Apple CEO Tim Cook rejected the request, citing the company’s policy of not undermining its own security features. If that version of iOS were ever leaked, anyone with access to the software could install it on any given phone and gain unfettered access to its data. The FBI eventually paid “professional hackers” to bypass the phone’s security and access the data — none of which provided the authorities with information about the shooting or the shooters’ plans.
