What Is Apple CSAM Detection? New Child Safety Protections Are A Double-Edged Sword

Apple has announced new Expanded Protections for Children which will roll out across their full estate of iOS, macOS, watchOS and iMessage. The very headline of the announcement would normally put it beyond question that this is a good idea. However, simply scratch the surface and you’ll find a plethora of data protection and privacy issues. That’s a surprise considering Apple now markets itself as privacy-first, stating that “privacy is a fundamental human right”. So what’s going on?

Apple’s New Child Protections

A while back, Apple confirmed plans to implement new child protection measures across hardware like the iPhone and, more importantly, their cloud storage service, iCloud. The details of this system to detect child sexual abuse material (CSAM) have now been confirmed. The American phone maker will release a handful of new features, but it’s the smart processing of images in iCloud that’s making headlines.

Apple plans to release software into their ecosystem that will scan for content relating to the sexual abuse of children. In short, Apple retains a database of digital fingerprints, or hashes, of known child abuse images. Using this new technology, they will then look for matches against the iCloud Photos storage of users. Should a match be found, Apple will be able to share information with law enforcement to take the necessary steps.
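To make the idea of hash matching a little more concrete, here’s a minimal sketch of the concept. The function names and hash values below are hypothetical, and a real perceptual hash like Apple’s NeuralHash maps visually similar images to the same fingerprint, which this simple stand-in does not do.

```swift
import Foundation

// Minimal sketch of hash matching, not Apple's actual NeuralHash system.
// The hash values below are made up for illustration.

// Database of fingerprints of known abuse images (illustrative values only).
let knownHashes: Set<String> = ["a1b2c3d4", "e5f6a7b8"]

// Hypothetical stand-in for a perceptual hash. A real perceptual hash gives
// visually similar images the same fingerprint; this simple digest does not.
func perceptualHash(of imageData: Data) -> String {
    String(imageData.hashValue, radix: 16)
}

// Check one photo against the known-hash database before it is uploaded.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    knownHashes.contains(perceptualHash(of: imageData))
}

let photo = Data([0x00, 0x01, 0x02])
print(matchesKnownDatabase(photo) ? "match found" : "no match")
```

Apple’s actual system is far more involved, matching blinded hashes on-device so that nothing is revealed unless a threshold of matches is crossed, but the core idea is a lookup against a fixed database like this.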

The Problem With Apple’s New Child Protection Measures

On the surface, how could you ever have a problem with anything that’s designed to stop child abuse? Well, therein lies the problem. By releasing this software in an effort to stifle the horrendous proliferation of child abuse online, it’s very hard to question Apple’s motives. Indeed, I don’t think anyone, abusers aside, has a problem with efforts to stop child abuse images. Yet many are up in arms after hearing Apple’s plans, because there are massive privacy concerns here.

How Else Could This Tech Be Used?

The concern of privacy experts, myself and, frankly, most of Twitter from what I’ve seen isn’t what this tech will do. It’s what it could do. Take the child abuse out of this. One tweak to the algorithm and this same system could be used to hunt for any collection of images.

First of all, not all nations are as free or well-intentioned as the ones we find ourselves in. Sure, you might question even that, but some regimes are outright corrupt. What’s to stop this technology from being used to find images or memes making fun of an unpopular political regime?

I’m no fan of WhatsApp. In fact, for privacy reasons, I quit the communications app completely. But even WhatsApp head Will Cathcart is concerned about an “Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control”.

We shouldn’t forget that Apple and Facebook really don’t like each other, but still, his point is valid.

Apple’s Response To CSAM System Concerns

In response to the concerns around this processing system, Apple released a six-page FAQ. The document seeks to put some fears at ease, answering questions like whether Apple will scan all photos on an iPhone. The answer is no; this only applies to photos in iCloud Photos.

Apple also confirms that it is taking these steps in an effort to protect children while also protecting privacy, a goal that Apple claims this system helps achieve.

Apple On Security Of CSAM System

The really interesting questions surround security for the CSAM detection system in iCloud Photos and whether or not the system can be used to detect anything other than CSAM. This is one of those times where a simple “no” is the best answer. However, Apple includes a word that will do little to allay the fears of privacy experts: “process”. In answering this question, Apple says their “process is designed to prevent that [the system being used to detect anything other than CSAM] from happening”. This basically means that yes, the system could be tweaked to cross-check any database of images against iCloud Photos.

Apple On Rogue Governments

So, could a government force Apple to use this system for their own benefit? You’ll have to take Apple’s word for it, but it says it “will refuse any such demands” from governments to add non-CSAM images to the database. Apple says it has “faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future”.

Apple On False Alarms

Here comes that “process” klaxon again. Could Apple’s CSAM system be abused and falsely flag innocent people to law enforcement? Apple says no and that “the system is designed to be very accurate”. Now, that’s cool and all, but smartphones were also designed not to catch fire. Space shuttles were designed not to fail and explode. While Apple says it has taken steps to avoid false flags, it’s not impossible.

Apple states that “the likelihood that the system would incorrectly flag any given account is less than one in one trillion per year”. Accounts that get flagged will be passed to a human for review. Personally, I see this as the weakest part, given humans have been known to be terribly error-prone.
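For a sense of how a claim like “one in a trillion” could even be plausible, here’s a rough back-of-the-envelope sketch. The false-match rate, library size and flagging threshold below are assumptions picked purely for illustration, not figures Apple has published in this form.

```swift
import Foundation

// Illustrative arithmetic only: these numbers are assumptions, not Apple's.
let perImageRate = 1e-6        // assumed chance one innocent photo falsely matches
let librarySize = 100_000.0    // assumed number of photos in the account
let threshold = 10.0           // assumed matches required before human review

// Expected number of false matches across the whole library.
let expectedFalseMatches = perImageRate * librarySize   // 0.1 in this example

// Poisson approximation: for a small expected count, the chance of reaching
// `threshold` false matches is roughly lambda^k / k!, which shrinks very fast.
func factorial(_ n: Double) -> Double { n <= 1 ? 1 : n * factorial(n - 1) }
let flagProbability = pow(expectedFalseMatches, threshold) / factorial(threshold)

print("Approximate chance this account is wrongly flagged: \(flagProbability)")
```

The point isn’t the exact figure; it’s that requiring multiple matches before anything is flagged is what lets a per-image error rate collapse into a vanishingly small per-account one.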

What About Other Services?

Many other cloud storage services already have similar systems in place. The problem with Apple is that it has been a beacon of hope for privacy for a long time. This is a significant step back for people who want a privacy-focused phone. You can disable iCloud Photos, but the beauty of Apple is the ecosystem that works so well.

Although, I’ve had my fair share of tearing my hair out because of iCloud.

Will Apple Launch iCloud Photo Scanning In Ireland?

Apple will roll out this use of what it’s calling NeuralHash later this year. It will launch in the United States first, and that’s an important side note. Apple has also confirmed that any additional markets for roll-out will be subject to local laws and regulations allowing for such a system.

While it remains possible Apple will roll out CSAM scanning in iCloud Photos here in Ireland, GDPR and the whole concept of processing user accounts in the EU from the US will likely delay things.

This is a really difficult topic to get your head around. For me, stopping child abuse is worth a loss of privacy. However, the technology could so easily be tweaked for utter evil that it terrifies me.

Update: This article was updated to include Apple’s response to concerns around their new iCloud Photos CSAM-detection system.

Written by

Marty
https://muckrack.com/marty-goosed
Founding Editor of Goosed, Marty is a massive fan of tech making life easier. You'll often find him testing something new, brewing beer or finding some new foodie spots in Dublin, Ireland. - Find me on Threads
