
posted by mrpg on Monday October 11 2021, @08:00AM
from the too-late-now-we-have-masks dept.

MEPs support curbing police use of facial recognition:

Police should be banned from using blanket facial-recognition surveillance to identify people not suspected of crimes. Certain private databases of people’s faces for identification systems ought to be outlawed, too.

That's the feeling of the majority of members in the European Parliament this week. In a vote on Wednesday, 377 MEPs backed a resolution restricting law enforcement’s use of facial recognition, 248 voted against, and 62 abstained.

“AI-based identification systems already misidentify minority ethnic groups, LGBTI people, seniors and women at higher rates, which is particularly concerning in the context of law enforcement and the judiciary,” reads a statement from the parliament.

“To ensure that fundamental rights are upheld when using these technologies, algorithms should be transparent, traceable and sufficiently documented, MEPs ask. Where possible, public authorities should use open-source software in order to be more transparent.”

MEP = Member of the European Parliament


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 1, Interesting) by Anonymous Coward on Tuesday October 12 2021, @01:47PM (#1186403)

    I think the idea is that these systems, having been trained mostly on young straight white men, aren't particularly good at differentiating faces outside that training set.

    As such, a system like this is more likely to miss obvious differences between the suspect and another person when that person is old, trans, a woman, or just not white. This is just how AI systems work: just as a plumber would be out of his element if you asked him to design a rocket engine, an AI trained on white guys will have a hard time telling two Gypsies apart.

    It isn't that the system was developed to be racist; it's that the people developing it accidentally created it in a way that made it racist. This invisible, unintentional racism is one aspect of systemic racism.
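
    As a rough illustration of that training-set skew, here is a toy sketch (synthetic data and scikit-learn, nothing to do with any real facial-recognition system): a classifier fitted on data that is 95% group A and 5% group B will typically show a much higher error rate on group B, even though nobody coded in any bias on purpose.

# Toy sketch: a model trained on a skewed dataset tends to perform worse
# on the under-represented group. Synthetic "feature vectors" stand in for
# face embeddings; all numbers here are illustrative assumptions, not real data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift, label_noise=0.05):
    """Two-class synthetic data for one demographic group, offset by `shift`."""
    X = rng.normal(loc=shift, scale=1.0, size=(n, 8))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)  # group-specific decision rule
    flip = rng.random(n) < label_noise                # a little label noise
    y[flip] = 1 - y[flip]
    return X, y

# Training set: 95% group A, 5% group B -- the skew the comment describes.
Xa_tr, ya_tr = make_group(1900, shift=0.0)
Xb_tr, yb_tr = make_group(100, shift=1.5)
clf = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xa_tr, Xb_tr]), np.concatenate([ya_tr, yb_tr]))

# Balanced test sets: the error rate comes out far higher on the group
# the model rarely saw during training.
Xa_te, ya_te = make_group(1000, shift=0.0)
Xb_te, yb_te = make_group(1000, shift=1.5)
print("error on group A:", 1 - clf.score(Xa_te, ya_te))
print("error on group B:", 1 - clf.score(Xb_te, yb_te))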
