Apple’s New CSAM tech is a big “no thank you”

Dangerous privacy-invasive tech needs some serious rethinking


An opinion piece by Douglas James.

Advancements in tech raise questions about protecting privacy. It's critical, especially in times like these, that we protect our personal lives and data, and not only from a cybersecurity standpoint. Unless you're comfortable with the idea of invisible hands sorting through your precious personal info, you don't want tech advancements breaching what few threads of privacy we have left.

That’s why Apple’s newest “development” is just one big no thank you. Though the primary goal sounds good on paper, tools like these open a Pandora’s box of privacy concerns.

What is CSAM?

Apple rallies behind the banner of scanning for child abuse. But if you ask me, a company that engages in shaky labor practices and resource exploitation doesn’t get to hide behind a “please think of the children” stance to justify what is, ultimately, a spying tool. CSAM stands for “child sexual abuse material,” and Apple’s detection tool relies on databanks of “known instances” to generate automated red flags. I realize this subject isn’t pleasant, but we can’t ignore invasive tech trends just because the subject is controversial.

Furthermore, you can imagine something like this snagging a lot of easy support. After all, who defends the abuse of children? Wouldn’t it be great to catch abusers with this tech? Please, think of the children!

But first off, let’s not kid ourselves. I’m on my soapbox, but hear me out. I don’t believe for an instant that Apple is operating on altruistic motives, and if we really want to get to the root of child abuse, we need to address the circumstances leading to it. Secondly, the question marks surrounding “automated flags” raise so many alarms. What constitutes an abusive image? What happens with false flags? Where does this data go? Who sees it? And who has the authority to act on it? How are you to know an automatic system isn’t just collecting pictures without your consent? After all, it’s not as if child predators would simply turn this tech off, right?

The problem is that Apple is setting a very dangerous precedent and laying the foundation for some incredibly invasive technology. People sound alarms about surveillance states; well, here you go. Apple contends the setup has fail-safes built in, uses hash data to flag content, and doesn’t scan stored images. No part of me is put at ease hearing that.
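To make that hash-matching claim concrete, here is a minimal, purely illustrative sketch of what flagging content against a databank of “known instances” with a threshold-style fail-safe could look like in principle. This is a toy under stated assumptions, not Apple’s actual pipeline: Apple reportedly uses a perceptual hash (NeuralHash) plus cryptographic matching rather than plain file hashes, and every name and number below is hypothetical.

```python
import hashlib
from typing import Iterable, Set

# Toy illustration only. Real systems use perceptual hashes and
# cryptographic protocols, not a plain SHA-256 lookup like this.

def fingerprint(image_bytes: bytes) -> str:
    """Exact-match fingerprint of an image's raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(images: Iterable[bytes], known_hashes: Set[str]) -> int:
    """Count how many images match the databank of 'known instances'."""
    return sum(1 for img in images if fingerprint(img) in known_hashes)

def should_flag(images: Iterable[bytes], known_hashes: Set[str],
                threshold: int = 30) -> bool:
    """The so-called fail-safe: only flag once matches cross a threshold."""
    return count_matches(images, known_hashes) >= threshold
```

Even in this toy form, every question from above still applies: who compiles the databank of known hashes, who sets the threshold, and who gets to see and act on the flag.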

Why it’s a problem

The intention behind this pursuit isn’t bad. Combating human trafficking with powerful technology sounds like a fantastic goal, especially when it comes to protecting minors and children. But this is a very fine line to walk, and there aren’t nearly enough guarantees here to warrant such a dangerous development.

If a device can be scanned for one thing, it can be scanned for others too. How far does that go? What information on your phone is still off-limits? How would you even know? We’re already up to our necks in a digital world that collects, siphons, and sells our info on unregulated markets so we can be barraged by the same unfunny fifteen-second ad for the 30th time.

And hang on to your hats: Apple isn’t the only one pursuing automated image scanning. Circles in the EU have floated mandates to scan for, and I quote, “criminal and terrorist activity.”

Oh, I’m sure that won’t lead to any problems or human rights violations at all. But hey, again, it’s about stopping crime and protecting children, so surely this is all done for the good of justice, right?

The violations of human rights and privacy

I must sound like a massive contrarian, dying on this digital hill as I discuss privacy rights. But it’s true. You have to understand that “for the children” is a banner under which some seriously dangerous legislation can slip through, even when the measures don’t really help disaffected kids at all. There’s lots of talk about using tech to catch and hunt predators, and not much about addressing systemic issues and actual child welfare.

Now, take this notion of “identifying criminal behavior” and/or “instances of child abuse” and extend it into the laws of different nations. If Apple services are used in a nation that, say, has less than a shining interest in women’s rights (where advocating for them is considered criminal behavior), what’s to stop that nation from using Apple’s new scanning tech to target groups of people?

And what triggers a scan from the phone appears to lack any coherent standard of probable cause. If you want to get real dystopian (because we know how the internet enjoys tossing out 1984 like it was yesterday’s news), government powers could use these spying tools to indiscriminately target people fitting vague descriptions of criminal behavior.

And finally, given how insecure things are on the cybersecurity front, what’s to stop ransomware gangs and malicious parties from stealing that priceless government data? You don’t want this info in the hands of the “good guys,” so what would “the bad guys” do with it?

So, in short, no. Let’s shelve this dystopian technology for a few decades.
