MEPs support curbing police use of facial recognition:
Police should be banned from using blanket facial-recognition surveillance to identify people not suspected of crimes. Certain private databases of people’s faces for identification systems ought to be outlawed, too.
That's the feeling of the majority of members in the European Parliament this week. In a vote on Wednesday, 377 MEPs backed a resolution restricting law enforcement’s use of facial recognition, 248 voted against, and 62 abstained.
“AI-based identification systems already misidentify minority ethnic groups, LGBTI people, seniors and women at higher rates, which is particularly concerning in the context of law enforcement and the judiciary,” reads a statement from the parliament.
“To ensure that fundamental rights are upheld when using these technologies, algorithms should be transparent, traceable and sufficiently documented, MEPs ask. Where possible, public authorities should use open-source software in order to be more transparent.”
MEP = Member of the European Parliament
(Score: 2) by looorg on Monday October 11 2021, @11:52AM (2 children)
I stopped at that part too. Can a machine tell what these people are just by looking at them? Perhaps if you are some kind of cross-dresser, but otherwise this seems a bit weird, or a massive overstatement of capability. Or is it just if a male looks effeminate or a female looks very butch? But that doesn't mean they are homosexual or whatnot.
That said, from that list one would say that white men and women would then be the ones that are the biggest losers in the world of the surveillance society. We are the ones that can be identified easily. After all, we created the system and trained it on images that looked like us.
(Score: 1, Interesting) by Anonymous Coward on Tuesday October 12 2021, @01:47PM
I think the idea is that these systems, having been trained mostly on young straight white men, aren't particularly good at differentiating faces outside that training set.
As such, such a system is more likely to miss obvious differences between the suspect and a person when that person is old, trans, a woman, or just not white. This is just how AI systems work: just as a plumber would be out of his element if you asked him to design a rocket engine, an AI trained on white guys will have a hard time differentiating between two Gypsies.
It isn't that the system was developed to be racist, just that the people developing it accidentally created it in a way that made it racist. This invisible, unintentional bias is one aspect of systemic racism.
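The effect described above can be illustrated with a toy sketch (not any real face-recognition system): pretend the model maps faces to one-dimensional "embeddings", and that it spreads identities from the majority group far apart while collapsing minority identities close together, because it never learned features that distinguish them. Nearest-neighbour identification then fails far more often for the minority group. All names and numbers here are made up for illustration.

```python
import random

random.seed(0)

def embed(identity, group):
    """Toy stand-in for a face-embedding model.

    For the 'majority' group, identities map to well-separated values;
    for the 'minority' group, identities cluster together, mimicking a
    model that never learned to tell those faces apart. The spread
    values are arbitrary assumptions for this sketch.
    """
    spread = 10.0 if group == "majority" else 1.0
    noise = random.gauss(0, 0.5)  # per-photo variation
    return identity * spread + noise

def error_rate(group, n_identities=20):
    """Enroll each identity once, then probe with a fresh photo and
    check whether nearest-neighbour lookup returns the right person."""
    gallery = {i: embed(i, group) for i in range(n_identities)}
    errors = 0
    for i in range(n_identities):
        probe = embed(i, group)
        match = min(gallery, key=lambda j: abs(gallery[j] - probe))
        errors += (match != i)
    return errors / n_identities

maj = error_rate("majority")
mino = error_rate("minority")
print(f"majority misidentification rate: {maj:.2f}")
print(f"minority misidentification rate: {mino:.2f}")
```

Because identity spacing in the minority group is comparable to the per-photo noise, lookups there frequently land on the wrong neighbour, while the majority group is essentially error-free. The same mechanism, with high-dimensional embeddings instead of scalars, is one standard explanation for demographic accuracy gaps.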
(Score: 2) by FatPhil on Friday October 15 2021, @04:39AM
Meanwhile, China continues doing research in FR, and I hear has a pretty good Uighur detector now.
Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves