Apple delays child protection measures after privacy concerns

WASHINGTON: Apple announced on Friday (Sep 3) that it is delaying the roll-out of its controversial new tools for detecting child sexual abuse images, which some have accused of undermining the privacy of its devices and services.

The Silicon Valley giant said last month that iPhones and iPads would soon start detecting images of child sexual abuse and reporting them as they are uploaded to its online storage in the United States.

However, digital rights organisations quickly noted that the changes to Apple’s operating systems would create a potential “backdoor” into its devices that governments or other groups could exploit.

Apple, in announcing the delay, cited feedback from customers, advocacy groups, researchers and others.

“We have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company said in a statement.

The new technology allows the software powering Apple mobile devices to match photos on a user’s phone against a database of known child sexual abuse images provided by safety organisations, then flag matches as they are uploaded to Apple’s online iCloud storage, according to the company.
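In principle, that matching step resembles a hash lookup: each photo is reduced to a short digest and checked against a list of digests of known abuse images. The sketch below is a deliberately simplified illustration, not Apple’s implementation: Apple’s published design uses a perceptual hash (“NeuralHash”) so that resized or re-encoded copies still match, whereas the plain SHA-256 used here matches only byte-identical files, and the real database is distributed in blinded form rather than as raw digests.

```python
import hashlib

# Hypothetical digests of known abuse images, as supplied by safety
# organisations (placeholder value only; per Apple's technical summary
# the real list is distributed in blinded/encrypted form).
KNOWN_DIGESTS: set[str] = {"0" * 64}  # placeholder entry


def digest(image_bytes: bytes) -> str:
    """Reduce an image to a short, fixed-size fingerprint."""
    return hashlib.sha256(image_bytes).hexdigest()


def should_flag_on_upload(image_bytes: bytes) -> bool:
    """True if the image's digest appears in the known database."""
    return digest(image_bytes) in KNOWN_DIGESTS


if __name__ == "__main__":
    print(should_flag_on_upload(b"holiday photo bytes"))  # False: not in the database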

The system, if and when it goes into use, would be “powered by a cryptographic technology” that determines “if there is a match without revealing the result”; the result would be disclosed only if the image was found to contain depictions of child sexual abuse.
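Apple’s technical summary pairs the matching step with a threshold scheme: individual matches reveal nothing to the server, which can decrypt the flagged material only once an account accumulates enough of them. The following is a minimal sketch of that threshold property using Shamir secret sharing over a prime field; the names and parameters are illustrative, not Apple’s.

```python
import secrets

PRIME = 2**127 - 1  # a Mersenne prime; a large-enough demo field


def make_shares(secret: int, threshold: int, count: int) -> list[tuple[int, int]]:
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    shares = []
    for x in range(1, count + 1):
        y = 0
        for c in reversed(coeffs):  # Horner's rule: evaluate the polynomial at x
            y = (y * x + c) % PRIME
        shares.append((x, y))
    return shares


def recover_secret(shares: list[tuple[int, int]]) -> int:
    """Lagrange interpolation at x = 0 over the prime field."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total


if __name__ == "__main__":
    secret = 123456789
    shares = make_shares(secret, threshold=3, count=5)
    assert recover_secret(shares[:3]) == secret  # 3 shares suffice
    assert recover_secret(shares[:2]) != secret  # below threshold, reconstruction yields garbage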

“INCREDIBLY DISAPPOINTING”

Critics of the policy welcomed the delay, but some child safety advocates urged Apple not to bend to those worried by the policy.

“This isn’t a fancy new Touchbar: it’s a privacy compromise that affects 1bn users,” tweeted Matthew Green, who teaches cryptography at Johns Hopkins University. “You need to justify escalations like this.”

Though Apple cited feedback from advocacy groups in its decision, not all welcomed the pause.

“This is incredibly disappointing,” tweeted Andy Burrows, head of child safety online at the National Society for the Prevention of Cruelty to Children.

“Apple had adopted a proportionate approach that sought to balance user safety and privacy, and should have stood their ground,” he added.

The new image-monitoring feature was to be part of a series of tools heading to Apple mobile devices, according to the company.

The move would represent a major shift for Apple, which has long resisted efforts to weaken the encryption that prevents third parties from seeing private messages.

Apple notably resisted a legal effort to weaken iPhone encryption so that authorities could access the phone of a suspect in a 2015 terror attack in San Bernardino, California.

FBI officials have warned that so-called “end-to-end encryption”, where only the sender and recipient can read messages, can protect criminals, terrorists and pornographers even when authorities have a legal warrant for an investigation.

Apple argued in a technical paper that the scanning technology developed by cryptographic experts “is secure, and is expressly designed to preserve user privacy”.

The company said it would have access only to violating images, which would be flagged to the National Center for Missing and Exploited Children, a non-profit organisation.

Facebook, which has faced criticism that its encrypted messaging app facilitates crime, has been studying the use of artificial intelligence to analyse the content of messages without decrypting them, according to a recent report by The Information.
