Apple under pressure to abandon child protection measures over privacy concerns
The fear is that Apple's message-scanning technology could be abused
Privacy campaigners around the world have urged Apple to abandon plans to automatically scan message content for child sexual abuse material (CSAM) and to prevent children from viewing objectionable content.
Although the 90 signatories of a letter coordinated by the Center for Democracy & Technology (CDT) acknowledged Apple’s positive intentions, they are concerned that the technology built to monitor messages and notify authorities of abuse would undermine the principles of end-to-end encryption.
Specifically, they fear that once governments know the technology exists, they could demand that Apple use it for purposes beyond its intended scope.
Apple encryption
“Although the new features are designed to protect children and reduce the spread of CSAM, they will create new risks for children and could be used to censor speech and threaten the privacy and security of people around the world,” say the signatories.
“Messages will no longer provide confidentiality and privacy to those users through an end-to-end encrypted messaging system in which only the sender and intended recipients have access to the information sent.
“Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit.”
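In broad terms, the mechanism at issue works by comparing fingerprints ("hashes") of a user's images against a database of known material and flagging an account once enough matches accumulate. The sketch below is a deliberately simplified illustration of why such a system is content-agnostic; every name, value and threshold here is hypothetical, and Apple's actual design (NeuralHash with cryptographic blinding) is considerably more involved:

```swift
// Hypothetical sketch of hash-based matching, not Apple's actual system.
// The point: the check flags whatever the database contains, so the same
// mechanism could be pointed at any category of image.

// Stands in for a perceptual hash (fingerprint) of an image.
struct ImageFingerprint: Hashable {
    let value: UInt64
}

// Database of known-material fingerprints (placeholder entries).
let knownFingerprints: Set<ImageFingerprint> = [
    ImageFingerprint(value: 0x1A2B_3C4D),
    ImageFingerprint(value: 0x5E6F_7A8B),
]

// Illustrative threshold: an account is only flagged after several matches.
let matchThreshold = 3

// Count how many of a user's images match the database.
func matchCount(in uploads: [ImageFingerprint]) -> Int {
    uploads.filter { knownFingerprints.contains($0) }.count
}

// The decision the signatories worry about: it depends entirely on what
// knownFingerprints holds, not on why those entries were added.
func shouldFlag(uploads: [ImageFingerprint]) -> Bool {
    matchCount(in: uploads) >= matchThreshold
}
```

In this toy model, extending detection to other kinds of content requires no new code at all, only new entries in the database, which is the crux of the letter's warning about repurposing.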
The letter also voices concerns about a change to family accounts that will notify parents if a sexually explicit image is detected on a child's device. Although this is designed to shield children from pornography, there are fears that younger users in less tolerant households could be denied access to educational material. For example, the feature could harm the wellbeing of LGBTQ+ youths with unsympathetic parents.
“Algorithms designed to detect sexually explicit material are notoriously unreliable. They are prone to mistakenly flag art, health information, educational resources, advocacy messages, and other imagery,” explains the letter. “An abusive adult may be the organiser of the account, and the consequences of parental notification could threaten the child’s safety and wellbeing.”
Apple has been a vocal supporter of encryption, and privacy has become a key feature of iOS. Indeed, recent versions allow users to stop advertisers from tracking their activity across apps.
In response to the concerns raised since its announcement, Apple has published a document outlining the policies it plans to put in place to prevent abuse, and has said it would not acquiesce to any government demand for access to material beyond the stated use case.
Apple has been approached for comment by TechRadar Pro.
Via CDT