
EU Commission Asks EU Council Lawyers If Compelled Client-Side Scanning is Legal, Gets Told It Isn’t

Accepted submission by upstart at 2023-05-16 13:57:42
News


EU Commission Asks EU Council Lawyers If Compelled Client-Side Scanning Is Legal, Gets Told It Isn’t [techdirt.com]:


(Mis)Uses of Technology [techdirt.com]

from the Welcome-to-Lawsuit-Town,-commissioners dept

Lots of ideas have been floated by legislators and others in hopes of limiting the distribution of child sexual abuse material (CSAM). Very few of these ideas have been good. Most have assumed the problem is so horrendous that any effort is justified. The problem is that governments actually need to justify mandated mass privacy invasions, something they almost never manage to do.

It’s even a fraught issue in the private sector. Apple briefly proposed engaging in client-side scanning of users’ devices to detect CSAM and prevent its distribution. The effort was put on hold [techdirt.com] when pretty much everyone [techdirt.com] objected to Apple’s proposal, citing the obvious problems it would create, including undermining the security and privacy protections [techdirt.com] Apple has long used as evidence of its superiority over competing products and their manufacturers.
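For those unfamiliar with the term, “client-side scanning” means the matching happens on the user’s own device, before anything is encrypted or uploaded. As a rough illustration only, here is a minimal Python sketch of the general idea, using exact-match SHA-256 hashing and a made-up blocklist; real proposals (including Apple’s shelved one) use perceptual hashing so that near-duplicate images also match, and nothing here reflects any actual implementation.

```python
# Hypothetical sketch of hash-based client-side scanning. SHA-256 exact
# matching stands in for the perceptual hashing real systems use; the
# blocklist entry below is just the digest of an empty file, used as a
# placeholder.
import hashlib
from pathlib import Path

# Stand-in for an externally supplied blocklist of known-CSAM digests.
# On a real device, this list would be opaque to the user.
KNOWN_BAD_DIGESTS = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def scan_file(path: Path) -> bool:
    """Return True if the file's digest appears on the blocklist."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return digest in KNOWN_BAD_DIGESTS

def scan_device(root: Path) -> list[Path]:
    # The crux of the legal objections discussed below: every file of
    # every user is checked, regardless of any individual suspicion.
    return [p for p in root.rglob("*") if p.is_file() and scan_file(p)]
```

The design choice under dispute is exactly this scan-everything default: because the check runs on the device itself, it reaches content that end-to-end encryption would otherwise keep private.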

Not that legislators appear to care. The EU Commission continues to move forward with a “for the children” client-side scanning mandate, despite the multitude of problems it would create. Last year, the proposal was ripped to shreds [europa.eu] by the EU Data Protection Board and its supervisor in a report explaining that the mandate would result in widespread privacy invasions and data privacy law violations that simply could not be excused by the Commission’s desire to limit the spread of CSAM.

EU Commissioner for Home Affairs Ylva Johansson penned an absurd, incomprehensible defense [techdirt.com] of the decimated proposal, one that basically declared it justified by its ends while glossing over the potentially disastrous means.

So, the proposal continues to move forward, ignoring pretty much every rational person’s objections [techdirt.com] and the German government’s flat-out refusal [techdirt.com] to enforce this mandate should it actually become law.

The Commission has ignored pretty much everyone while pushing this massive privacy/security threat toward the legislative goal line. But it may not be able to ignore the latest objections to its proposal, given that they’re being raised by the EU government’s own lawyers [techcrunch.com].

A legal opinion on a controversial European Union legislative plan set out last May [techcrunch.com], when the Commission proposed countering child sexual abuse online by applying obligations on platforms to scan for abuse and grooming, suggests the planned approach is incompatible with existing EU laws that prohibit general and indiscriminate monitoring of people’s communications.

The advice by the Council’s legal service on the proposed Child Sexual Abuse Regulation (also sometimes referred to as “Chat control”), which leaked online [twitter.com] this week — and was covered by The Guardian [theguardian.com] yesterday — finds the regulation as drafted to be on a collision course with fundamental European rights like privacy and data protection; freedom of expression; and the right to respect for a private family life, as critics have warned from the get-go.

Security consultant and cryptographer Alec Muffett [twitter.com] posted a link to the leaked legal opinion on Twitter, highlighting the Council’s numerous statements that the proposal has zero chance of being found legal under EU law, much less of being found a proportionate response to a problem serious enough to justify massive rights violations by the EU government.

The legal opinion [documentcloud.org] [PDF] makes it clear there’s very little that’s actually legal about compelled client-side scanning. The entire thing is damning, but here’s just one of the several issues the Council’s legal service says the EU Commission has gotten wrong:

The screening of interpersonal communications as a result of the issuance of a detection order undeniably affects the fundamental right to respect for private life, guaranteed in Article 7 of the Charter, because it provides access to and affects the confidentiality of interpersonal communications (text messages, e-mails, audio conversations, pictures or any other kind of exchanged personal information). It is also likely to have a deterrent effect on the exercise of freedom of expression, which is enshrined in Article 11 of the Charter. It does not matter in this respect whether the information in question relating to private life is sensitive or whether the persons concerned have been inconvenienced in any way on account of that interference.

Furthermore, such screening constitutes the processing of personal data within the meaning of Article 8 of the Charter and affects the right to protection of personal data provided by that provision.

It must be noted, in this respect, that under settled case law the fact that the automated analysis based on predefined indicators would not, as such, allow all the users whose data is being analysed to be identified, does not prevent such data from being considered personal data, in so far as the automated analysis would allow the person or persons concerned by the data to be identified at a later stage. According to the definition of personal data in Article 4(1) of the General Data Protection Regulation (GDPR), information relating, inter alia, to an identifiable person constitutes personal data. Therefore, screening of all communications in a given service, with the assistance of an automated operation, presupposes systematic access to and processing of all information and constitutes an interference with the right to data protection, regardless of how that data is used subsequently. In particular, the question whether that information is subsequently accessed by the competent authorities is irrelevant.

A shotgun approach to CSAM detection is a civil rights disaster waiting to happen, especially when the government decides all users of a service are guilty just because some users are using the service to distribute illegal content.

The proposed legislation requires the general screening of the data processed by a specific service provider without any further distinction in terms of persons using that specific service. The fact that the detection orders would be directed at specific services where there is evidence of a significant risk of the service being used for the purpose of online child sexual abuse would be based on a connection between that service and the crimes of child sexual abuse, and not, even indirectly, on the connection between serious criminal acts and the persons whose data are scanned. The data of all the persons using that specific service would be scanned without those persons being, even indirectly, in a situation liable to give rise to criminal prosecutions, the use of that specific service being the only relevant factor in this respect.

And this would set off a chain of events that could easily result in permanent surveillance of millions of people’s communications across multiple internet-based services. Not so much mission creep as mission sprint.

Furthermore, since issuing a detection order with regard to a specific provider of interpersonal communication services would entail the risk of encouraging the use of other services for child sexual abuse purposes, there is a clear risk that, in order to be effective, detection orders would have to be extended to other providers and lead de facto to a permanent surveillance of all interpersonal communications.

The opinion also notes that weakening encryption to achieve this goal would interfere even further with citizens’ rights, making it that much more difficult for them to protect their own data and preventing providers from taking the steps needed to secure the data they collect from users. Collecting written communications that contain no images in order to detect those trying to lure children into CSAM creation creates yet another long list of data privacy law violations, since it introduces even more surveillance in hopes of ascertaining the ages of those engaged in these conversations.

And, while CSAM is definitely a problem, it’s not on par with threats to the national security of EU nations, which is what justifies most exceptions to EU privacy laws, no matter how EU Commissioners portray it when advocating for mass privacy violations. The Council notes that the exceptions made in the interest of national security are unlikely to apply to CSAM detection, especially when the proposal demands something even intelligence agencies haven’t been able to legally obtain. (Emphasis in the original.)

[I]f the screening of communications metadata was judged by the Court proportionate only for the purpose of safeguarding national security, it is rather unlikely that similar screening of content of communications for the purpose of combating crime of child sexual abuse would be found proportionate, let alone with regard to the conduct not constituting criminal offences.

The Council sums up its report by saying that if this proposal hopes to survive even the most cursory of legal challenges, it needs to vastly decrease its scope and greatly increase the specificity of its targeting. Otherwise, it’s just a bunch of illegal surveillance masquerading as a child protection program. The Commission may be able to ignore security professionals and the occasional member state, but it seems unlikely it can just blow off its own lawyers.


Original Submission