Apple rolling out “mass surveillance” image scanner in new iPhone update

Privacy watchdog groups have drawn attention to tech giant Apple’s plans to install software on users’ iPhones that will scan for images of child sex abuse, warning that the move opens a window into users’ private lives that governments could exploit.

“Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices,” the Financial Times reported. “The automated system would proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified.”

The software, called “neuralMatch,” is expected to ship with iOS 15, due for release next month, and will initially be rolled out only in the US, with Apple writing in a blog post that the system will “evolve and expand over time.”

The company claimed that the software provides “significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM [child sexual abuse material] in their iCloud Photos account.”

Despite Apple’s assurances, privacy watchdogs and academics remain concerned, warning that once the scanning infrastructure exists it could be repurposed for far broader surveillance.

Ross Anderson, professor of security engineering at the University of Cambridge, said: “It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of . . . our phones and laptops.” 

The New York Times explained how the technology will work:

The iPhone operating system will soon store a database of hashes of known child sexual abuse material provided by organizations like the National Center for Missing & Exploited Children, and it will run those hashes against the hashes of each photo in a user’s iCloud to see if there is a match.

Once there are a certain number of matches, the photos will be shown to an Apple employee to ensure they are indeed images of child sexual abuse. If so, they will be forwarded to the National Center for Missing & Exploited Children, and the user’s iCloud account will be locked. Apple said this approach meant that people without child sexual abuse material on their phones would not have their photos seen by Apple or the authorities.
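To make the mechanism the Times describes concrete, here is a minimal, hypothetical sketch in Swift: one hash per photo, a lookup against a set of known hashes, and a flag for human review only after a threshold number of matches. The type names, the plain SHA-256 placeholder hash, and the threshold parameter are all invented for illustration; Apple’s reported system instead relies on a perceptual image hash and cryptographic threshold techniques, none of which are reproduced here.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: photos are hashed, hashes are checked against a
// database of known hashes (supplied by organizations such as NCMEC), and an
// account is flagged for human review only once a threshold is reached.
struct MatchScanner {
    let knownHashes: Set<String>   // hashes of known abuse imagery (hypothetical format)
    let reviewThreshold: Int       // number of matches required before human review

    // Placeholder hash: a real system would use a perceptual hash so that
    // resized or re-encoded copies of the same image still match.
    func hash(of photoData: Data) -> String {
        SHA256.hash(data: photoData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // Returns true only if the number of matching photos reaches the threshold.
    func shouldFlagForReview(photos: [Data]) -> Bool {
        let matches = photos.filter { knownHashes.contains(hash(of: $0)) }.count
        return matches >= reviewThreshold
    }
}
```

In the design the Times describes, the match threshold and the human-review step are what Apple points to when it says accounts without such material would never have their photos seen by Apple or the authorities.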

“If you’re storing a collection of [child sexual abuse material], yes, this is bad for you,” said Erik Neuenschwander, Apple’s privacy chief. “But for the rest of you, this is no different.” 

In response to the news, Edward Snowden tweeted, “No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs—*without asking.*”
