Apple to scan iPhones for child sex abuse images

Apple has announced details of a system to find child sexual abuse material (CSAM) on US customers’ devices. Before an image is stored in iCloud Photos, the technology will search for matches of already known CSAM. Apple said that if a match is found, a human reviewer will assess the image and report the user to law enforcement.

However, there are privacy concerns that the technology could be expanded to scan phones for other prohibited content, or even political speech. Experts worry that the technology could be used by authoritarian governments to spy on their citizens.

Apple said that new versions of iOS and iPadOS – due to be released later this year – will have “new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy”.

The system works by comparing pictures to a database of known child sexual abuse images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organisations.

Those images are translated into “hashes”, numerical codes that can be “matched” to an image on an Apple device. Apple says the technology will also catch edited but similar versions of original images.
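
The article does not name Apple’s hashing algorithm, and the real system’s details are proprietary. As a rough sketch of the general idea, the toy “average hash” below shows how a perceptual hash can match lightly edited copies of an image: visually similar pictures produce hashes that differ in only a few bits. The hash value, match threshold and database here are placeholders, not Apple’s.

```python
# Toy perceptual hash ("average hash"), for illustration only.
# Apple's actual hashing algorithm is proprietary and not shown here.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit hash that survives small edits."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        # Each bit records whether a pixel is brighter than the average.
        bits = (bits << 1) | (p > mean)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count how many bits differ between two hashes."""
    return bin(a ^ b).count("1")

# Placeholder database of hashes of known images (made-up value).
KNOWN_HASHES = {0x8F3C00FF12AB34CD}

def matches_known(path: str, max_distance: int = 5) -> bool:
    """Flag an image whose hash is within a few bits of a known hash."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= max_distance for k in KNOWN_HASHES)
```

By contrast, a cryptographic hash such as SHA-256 changes completely under any edit, which is why perceptual-style hashing is what allows modified copies to be caught.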

‘High level of accuracy’

“Before a picture is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes,” Apple said.

The company claimed the system had an “extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account”.
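
The article does not explain how an account-level figure that low could be reached from imperfect per-image matching. One common approach, assumed here purely for illustration, is to require several matches before an account is flagged: with independent errors, the combined false-positive probability collapses rapidly. All numbers below are made up.

```python
# Toy calculation: how a match threshold drives down account-level
# false positives. Every number here is hypothetical, not Apple's.
from math import exp, factorial

def flag_probability(n_images: int, per_image_fp: float,
                     threshold: int, extra_terms: int = 50) -> float:
    """Approximate P(at least `threshold` false matches among `n_images`),
    using a Poisson model for rare, independent per-image errors."""
    lam = n_images * per_image_fp  # expected false matches per year
    return sum(lam**k * exp(-lam) / factorial(k)
               for k in range(threshold, threshold + extra_terms))

# Hypothetically: 10,000 photos a year, a one-in-a-million per-image
# false-match rate, and 10 matches required before review.
print(flag_probability(10_000, 1e-6, 10))  # ≈ 2.7e-27, far below 1e-12
```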

Apple says that it will manually review each report to confirm there is a match. It can then take steps to disable a user’s account and report to law enforcement.

The company says that the new technology offers “significant” privacy benefits over existing techniques – as Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account.

However, some privacy experts have voiced concerns.

“Regardless of what Apple’s long-term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” Matthew Green, a security researcher at Johns Hopkins University, said.

“Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone.”
