EU Commission Asks EU Council Lawyers If Compelled Client-Side Scanning Is Legal, Gets Told It Isn't:
Lots of ideas have been floated by legislators and others in hopes of limiting the distribution of child sexual abuse material (CSAM). Very few of these ideas have been good. Most have assumed that the problem is so horrendous any efforts are justified. The problem here is that governments need to actually justify mandated mass privacy invasions, which is something that they almost always can't do.
It's even a fraught issue in the private sector. Apple briefly proposed engaging in client-side scanning of users' devices to detect CSAM and prevent its distribution. This effort was put on hold when pretty much everyone objected to Apple's proposal, stating the obvious problems it would create — a list that included undermining the security and privacy protections Apple has long used as evidence of its superiority over competing products and their manufacturers.
Not that legislators appear to care. The EU Commission continues to move forward with its "for the children" client-side scanning mandate, despite the multitude of problems this mandate would create. Last year, the proposal was ripped to shreds by the EU Data Protection Board and its supervisor in a report that explained the mandate would result in widespread privacy invasions and data privacy law violations that simply could not be excused by the Commission's desire to limit the spread of CSAM.
[...] So, the proposal continues to move forward, ignoring pretty much every rational person's objections and the German government's flat-out refusal to enforce this mandate should it actually become law.
The Commission has ignored pretty much everyone while pushing this massive privacy/security threat past the legislative goal line. But it may not be able to ignore the latest objections to its proposal, given that they're being raised by the EU government's own lawyers.
[...] The legal opinion [PDF] makes it clear there's very little that's actually legal about compelled client-side scanning. The entire thing is damning, but here's just one of several issues on which the Council's legal service says the EU Commission is wrong:
[...] A shotgun approach to CSAM detection is a civil rights disaster waiting to happen, especially in cases where the government decides all users of a service are guilty just because some users are using the service to distribute illegal content.
The proposed legislation requires the general screening of the data processed by a specific service provider without any further distinction in terms of persons using that specific service. The fact that the detection orders would be directed at specific services where there is evidence of a significant risk of the service being used for the purpose of online child sexual abuse would be based on a connection between that service and the crimes of child sexual abuse, and not, even indirectly, on the connection between serious criminal acts and the persons whose data are scanned. The data of all the persons using that specific service would be scanned without those persons being, even indirectly, in a situation liable to give rise to criminal prosecutions, the use of that specific service being the only relevant factor in this respect.
And this would set off a chain of events that could easily result in permanent surveillance of millions of people's communications across multiple internet-based services. Not so much mission creep as mission sprint.
Furthermore, since issuing a detection order with regard to a specific provider of interpersonal communication services would entail the risk of encouraging the use of other services for child sexual abuse purposes, there is a clear risk that, in order to be effective, detection orders would have to be extended to other providers and lead de facto to a permanent surveillance of all interpersonal communications.
[...] The Council sums up its report by saying that if this proposal hopes to survive even the most cursory of legal challenges, it needs to vastly decrease its scope and greatly increase the specificity of its targeting. Otherwise, it's just a bunch of illegal surveillance masquerading as a child protection program. The Commission may be able to ignore security professionals and the occasional member state, but it seems unlikely it can just blow off its own lawyers.
(Score: 5, Interesting) by JustNiz on Thursday May 18, @06:12PM (1 child)
Is anyone here naive enough to truly believe that CSAM is what this is actually all about?
It seems to me that once in place, this mechanism is inevitably going to get extended to uses other than anti-CSAM (probably without the end-user's permission or even knowledge, until some hacker inevitably figures it out), and once again the vast majority of the public will just keep buying products that have this enabled because they are gullible enough to believe it's actually about protecting kids.
Even Orwell didn't predict that citizens being spied on by governments would happily pay for the devices out of their own pockets and voluntarily install them in their own homes.
(Score: 3, Insightful) by deimtee on Thursday May 18, @09:36PM
Obviously, once the mechanism is in place, there will be a very strong push by the copyright industry to use it as well.
They will be keeping quiet at the moment. Probable statement: "CSAM is very bad and we support every effort to catch those responsible. Who, us? Use it to catch those damn pirates? Never considered it."
No problem is insoluble, but at Ksp = 2.943×10⁻²⁵, mercury sulphide comes close.