Apple ends controversial project to scan iCloud Photos backups for child sexual abuse images, after harsh criticism from security experts


Apple has ended its project to develop a tool for detecting known child sexual abuse material stored in iCloud Photos. Critics had warned Apple that it was opening Pandora’s box, arguing that some states could require the feature to be diverted to other purposes, including mass surveillance. After a thorough review, Apple announced on Wednesday that it is ending the project and will instead focus on end-to-end encryption of iCloud backups. Apple believes that child sexual abuse can be prevented before it happens.

In August 2021, Apple announced a plan to scan images stored by users in iCloud Photos for child sexual abuse material (CSAM). The tool was designed to preserve privacy by allowing Apple to detect and report potentially abusive content without revealing anything else about a user’s library. But the move was controversial and quickly drew widespread criticism from privacy and security researchers and digital rights advocacy groups, who feared the surveillance capability itself could be abused.

Such abuse could harm the privacy and security of iCloud users around the world. In early September 2021, Apple said it would pause the feature’s rollout to “gather feedback and make improvements before releasing these critically important child safety features.” In other words, a launch was still planned. But on Wednesday, Apple said that in response to the feedback and advice it received about the initiative, the CSAM detection tool for photos stored in iCloud is dead.

“After extensive consultation with experts to gather feedback on the child protection initiatives we proposed last year, we’re ramping up our investment in the Communication Safety feature we first made available in December 2021. We’ve decided not to go ahead with our CSAM detection tool previously proposed for iCloud Photos. Children can be protected without companies combing through personal data,” Apple said in a statement sent to the media.

“We will continue to work with governments, child advocates and other companies to help protect young people, preserve their right to privacy and make the internet a safer place for children and for all of us,” the company added. Apple had planned to compare hashes of photos uploaded to iCloud against a database of known CSAM and to report matching accounts to the US National Center for Missing and Exploited Children (NCMEC), a nonprofit that works with US law enforcement. Apple also said there was a one-in-a-billion chance of the tool returning a false positive.
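To make the matching mechanism described above concrete, here is a minimal sketch of checking an image against a set of known digests. This is not Apple’s actual design: the proposed system used a perceptual hash called NeuralHash and a private set intersection protocol so matching could happen on-device without exposing either the list or the photos. The SHA-256 digest and the `matchesKnownHash` helper below are hypothetical stand-ins used only to keep the sketch self-contained.

```swift
import Foundation
import CryptoKit

// Simplified, hypothetical illustration of hash-based matching against a
// known-image database. Apple's proposed system relied on a perceptual hash
// (NeuralHash) and private set intersection, not a plain cryptographic digest.

/// Stand-in for a provider-supplied list of known-image digests (hex strings).
/// In a real system this would be populated from a vetted hash list.
let knownHashes: Set<String> = []

/// Returns true if the image's digest appears in the known-hash set.
func matchesKnownHash(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}
```

One reason a plain digest like this would not work in practice is that changing a single pixel changes the hash entirely; a perceptual hash is designed so that visually similar images produce similar hashes, which is also part of why researchers worried about false positives and adversarial collisions.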

Apple’s plans were criticized by a wide range of people and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, academic researchers, rivals such as Epic Games, and even some Apple employees. Some argued that the tool would create a “backdoor” into Apple devices that governments or law enforcement agencies could use to track users. There was also concern that a malicious third party could frame someone by deliberately adding CSAM photos to the victim’s iCloud account.

Such a scenario could cause great harm to the victim. Recall that in August 2021, Apple announced plans to roll out three new child safety features: a system for detecting known CSAM photos stored in iCloud Photos, a Communication Safety option that blurs sexually explicit images in the Messages app, and child exploitation resources for Siri. Communication Safety launched in the US with iOS 15.2 in December 2021 and has since expanded to the UK, Canada, Australia and New Zealand.

Parents and guardians can opt in to these protections through family iCloud accounts. The features work with Siri, Apple’s Spotlight search and Safari search, warning when someone views or searches for CSAM-related material and providing resources for reporting the content and getting help. At the heart of the protection is Communication Safety for Messages, which guardians can configure to show warnings and resources to children who receive or attempt to send images containing nudity.

The aim is to stop the exploitation of children before it occurs or persists and to reduce the creation of new CSAM. Alongside the announcement that it was abandoning the project, Apple also announced a vast expansion of its end-to-end encryption offerings for iCloud, including protection for backups and photos stored in the cloud. While this second announcement was well received by iPhone users, it drew some criticism from child safety organizations.
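At its core, the end-to-end protection described here means data is encrypted on the device with a key Apple’s servers never hold. The sketch below uses Apple’s CryptoKit to illustrate that idea; it is a simplification under stated assumptions, not the Advanced Data Protection design, and the `deviceKey`, `encryptForUpload`, and `decryptAfterDownload` names are hypothetical.

```swift
import Foundation
import CryptoKit

// Conceptual sketch of client-side ("end-to-end") encryption before upload.
// This is not Apple's actual implementation; it only shows the core idea
// that data is sealed with a key the server never sees.

/// Key generated and kept on the device; in practice it would be protected
/// by the user's credentials and the hardware keystore.
let deviceKey = SymmetricKey(size: .bits256)

/// Encrypts a backup payload so only the key holder can read or verify it.
func encryptForUpload(_ plaintext: Data) throws -> Data {
    // AES-GCM produces nonce + ciphertext + authentication tag in one blob.
    let sealedBox = try AES.GCM.seal(plaintext, using: deviceKey)
    return sealedBox.combined!   // non-nil for the default 12-byte nonce
}

/// Reverses the operation on a device that holds the same key.
func decryptAfterDownload(_ payload: Data) throws -> Data {
    let sealedBox = try AES.GCM.SealedBox(combined: payload)
    return try AES.GCM.open(sealedBox, using: deviceKey)
}
```

Because the provider only ever stores the sealed blob, it cannot scan the contents on its servers, which is precisely the tension with CSAM detection that the following paragraphs describe.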

Anti-CSAM groups often oppose a wider rollout of end-to-end encryption because it makes user data inaccessible to tech companies, making CSAM detection and reporting harder. Law enforcement agencies around the world have likewise cited the serious problem of child sexual abuse in opposing the use and expansion of end-to-end encryption, although many of these agencies have long opposed end-to-end encryption in general because it can make some investigations more difficult.

Research has consistently shown, however, that end-to-end encryption is an important security tool for the protection of human rights and that the drawbacks of deploying it do not outweigh the benefits. Craig Federighi, Apple’s senior vice president of software engineering, said: “Child sexual abuse can be prevented before it happens. This is where we are putting our energy going forward.” He was asked directly about the impact of expanded encryption on the work of law enforcement officers investigating crimes.

Federighi responded: “Ultimately, securing customer data has huge implications for our security overall.” While there is no specific timeline for expanding Communication Safety, Apple is working on adding the ability to detect nudity in videos sent via Messages when the protection is enabled. The company also plans to expand the offering beyond Messages to its other communication apps. Ultimately, the goal is to allow third-party developers to integrate the Communication Safety tools into their own applications.

According to Apple, the more widely these features are available, the more likely it is that children will get the information and support they need before they are exploited. Like other companies that have publicly grappled with how to handle CSAM, Apple said it also plans to keep working with child safety experts to make it as easy as possible for users to report exploitative content and situations to advocacy organizations and law enforcement agencies.

And you?

What is your opinion on the subject?
What do you think of Apple’s reversal of its CSAM detection tool?
What are your thoughts on expanding end-to-end encryption offerings for iCloud?

See also

Apple announced end-to-end encryption for iCloud backups and introduced security key support for two-factor authentication

In an internal memo, Apple addresses concerns over its new photo scanning features as experts push back over privacy

Apple is considering a system to detect child abuse images in users’ photo galleries, but security researchers fear widespread surveillance of iPhones
