The European Commission’s CSAM legislation poses a huge threat to personal privacy

The scale of the legislation and what it seeks to regulate is in itself alarming. It is not about publicly available content online; it is about Europeans’ private messages.

The European Commission wants your messaging provider to spy on you. Its proposed new anti-encryption legislation would require providers of messaging services in the EU to scan their users’ messages after receiving a “detection order” from a European law enforcement agency. The EC argues that this is needed to combat the proliferation of child sexual abuse material (CSAM) online, but it raises far wider surveillance concerns.

The legislation also establishes a new EU Centre to coordinate law enforcement’s efforts in fighting child sexual abuse online. This new European centre would coordinate between European law enforcement agencies, Europol, member states, messaging providers and victims of sexual abuse. It would serve as a clearing house for known images of abusive content and assist in developing techniques for identifying such content in online messaging systems. The EC suggests that the EU Centre be founded in The Hague in order to benefit from proximity to Europol.

This legislation is about sifting through people’s emails and any other messaging applications they use: services like WhatsApp, Telegram, Facebook Messenger, Twitter and others, platforms where users typically have a reasonable expectation of privacy.

The EC envisions much of the actual work of identifying messages containing child sexual abuse being done by machines, not by humans, with the inherent assumption that artificial intelligence algorithms will deftly distinguish between what is CSAM and what is not. This in itself is pure fantasy. Worse yet, the legislation pretends that computer magic will be able to identify “grooming” of children by potential predators. Thus, not only will computers magically distinguish between parents sending a picture of their children to family and someone sending that same picture to a predator; computers will also somehow identify when an adult is grooming a child in order to commit potential future crimes. All this when computers can’t even tell how old someone in a photo is, or reliably distinguish between a picture of a sloth and a picture of a racecar.

The way this is all supposed to work is that the EU Centre, in cooperation with other relevant EU law enforcement agencies, will send a “detection order” to a messaging service provider. This messaging service provider could be a large company, like Facebook or Twitter. Or it could be a small forum operator, or DiEM25 itself, since DiEM25 operates multiple messaging platforms.

Once a detection order is received, the provider is required to implement detection technologies that are “effective in detecting the dissemination of known or new child sexual abuse material or the solicitation of children”. It is then largely up to the provider to ensure that these technologies effectively identify CSAM, do not extract more information than is necessary for the detection order, and do not infringe on users’ rights too heavily. Providers are required to tell their users that they’re being spied on, but not which users are being spied on. None of this relieves the provider of its obligations under other European privacy laws, including the General Data Protection Regulation (GDPR).

Since there is no way this can be handled reliably in an automated fashion, it will require intervention and content evaluation by humans, most likely employees of the messaging providers themselves. That requires that employees of messaging providers be able to read their users’ messages. This legislation is great if you’re a large messaging provider that already spies on its users, and terrible for small operators who promise to keep their users safe from spying by storing their messages encrypted.

This is why, in our Technology Sovereignty paper, we call for strengthening privacy regulation in the EU and restricting state-enabled corporate surveillance of the public (section 2.1.1). EU states should not promote spying on their citizens through corporate messaging platforms.

We should not forget that, when it comes to invading our privacy, law enforcement just can’t help itself. Typical of legislation designed to take away our rights, the EC has given us a victim, a bad guy, and cast itself as the hero who saves the day. It didn’t even bother coming up with a new bad guy. The “Four Horsemen of the Infocalypse” are the four typical bad guys that legislators deploy when they want to deprive us of our rights: terrorists, drug dealers, paedophiles, and organised crime. They chose number three this time. Maybe next time they’ll choose a different one.

