Apple to scan iPhones for child sexual abuse images
Apple has announced details of a system to find child sexual abuse material (CSAM) on US customers' devices. Before an image is stored in iCloud Photos, the technology will search for matches against known CSAM.
But there are privacy concerns that the technology could be expanded to scan phones for prohibited content or even political speech. Experts fear the technology could be used by authoritarian governments to spy on their citizens.
Apple said that new versions of iOS and iPadOS – due to be released later this year – will have "new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy".
The system works by comparing pictures to a database of known child sexual abuse images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organisations.
Those images are translated into "hashes", numerical codes that can be "matched" to an image on an Apple device.
Apple said the technology will also catch edited but similar versions of original images.
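The matching idea described above can be sketched in a few lines. Note this is only an illustration: Apple's actual system uses NeuralHash, a proprietary perceptual hash designed to tolerate edits, combined with cryptographic protocols; the SHA-256 stand-in below would not match edited images, and the database contents here are invented.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Translate image bytes into a numerical code (a hash).

    Stand-in only: a cryptographic hash like SHA-256 changes completely
    if even one pixel changes, unlike the perceptual hash Apple uses.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes of known images (illustrative bytes only).
KNOWN_HASHES = {
    image_hash(b"example-known-image-bytes"),
}

def matches_known(image_bytes: bytes) -> bool:
    """On-device check: does this image's hash appear in the known database?"""
    return image_hash(image_bytes) in KNOWN_HASHES

print(matches_known(b"example-known-image-bytes"))  # True
print(matches_known(b"some-other-image-bytes"))     # False
```

The key design point is that only hashes are compared, so the matching step never needs to look at the image content itself.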
‘High level of accuracy’
"Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes," Apple said.
The company claimed the system had an "extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account".
Apple said it would manually review each report to confirm there is a match. It could then take steps to disable a user's account and report the matter to law enforcement.
The company said the new technology offers "significant" privacy benefits over existing techniques, because Apple only learns about users' photos if they have a collection of known CSAM in their iCloud Photos account.
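The "collection" claim implies a threshold: no single match is enough to expose anything to Apple. A minimal sketch of that idea, assuming a hypothetical threshold value and class name (Apple's real design enforces this cryptographically with threshold secret sharing, not a simple counter):

```python
MATCH_THRESHOLD = 30  # hypothetical value for illustration


class AccountScanner:
    """Toy model: flag an account for human review only once the number
    of matched uploads reaches a threshold, never on a single match."""

    def __init__(self, known_hashes: set, threshold: int = MATCH_THRESHOLD):
        self.known_hashes = known_hashes
        self.threshold = threshold
        self.match_count = 0

    def process_upload(self, uploaded_hash: str) -> bool:
        """Count a match if the hash is known; return True only when the
        accumulated matches cross the threshold."""
        if uploaded_hash in self.known_hashes:
            self.match_count += 1
        return self.match_count >= self.threshold


scanner = AccountScanner(known_hashes={"h1", "h2"}, threshold=2)
print(scanner.process_upload("h1"))  # False: one match, below threshold
print(scanner.process_upload("h3"))  # False: not a known hash
print(scanner.process_upload("h2"))  # True: second match reaches threshold
```

In the real system the counting happens inside an encrypted protocol, so Apple cannot see the running tally until the threshold is crossed.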
However, some privacy experts have voiced concerns. "Regardless of what Apple's long-term plans are, they've sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users' phones for prohibited content," said Matthew Green, a security researcher at Johns Hopkins University.
"Whether they turn out to be right or wrong on that point hardly matters. This will break the dam – governments will demand it from everyone."