posted by mrpg on Monday October 11 2021, @08:00AM
from the too-late-now-we-have-masks dept.

MEPs support curbing police use of facial recognition:

Police should be banned from using blanket facial-recognition surveillance to identify people not suspected of crimes. Certain private databases of people’s faces for identification systems ought to be outlawed, too.

That's the feeling of the majority of members in the European Parliament this week. In a vote on Wednesday, 377 MEPs backed a resolution restricting law enforcement’s use of facial recognition, 248 voted against, and 62 abstained.

“AI-based identification systems already misidentify minority ethnic groups, LGBTI people, seniors and women at higher rates, which is particularly concerning in the context of law enforcement and the judiciary,” reads a statement from the parliament.

“To ensure that fundamental rights are upheld when using these technologies, algorithms should be transparent, traceable and sufficiently documented, MEPs ask. Where possible, public authorities should use open-source software in order to be more transparent.”

MEP = Member of the European Parliament


Original Submission

 
  • (Score: 3, Insightful) by Anonymous Coward on Monday October 11 2021, @09:48AM (8 children)

    by Anonymous Coward on Monday October 11 2021, @09:48AM (#1186100)

    What the fuck?
    How is the damn machine to know which group you identify with? That's not even what the FR is trying to do. Fuck these completely dumbass faux-justifications. If you have a real reason, use that, and don't pile up garbage like this on top. That just makes the whole argument lame.

    And no, I don't want general facial recognition around that can pinpoint my location at any time. But you can't just go to court with a printed note that says "AI says it's a match". There needs to be a secondary inspection of the results in any case.

    • (Score: 0) by Anonymous Coward on Monday October 11 2021, @11:02AM (1 child)

      by Anonymous Coward on Monday October 11 2021, @11:02AM (#1186104)

      If wokeness is the actual reason that surveillance gets curtailed, then it was good for once and only once.

      But let's not kid ourselves. One bad terrorist attack or a few scary anti-LGBTIQ2AP+ rallies, and facials are here to stay.

      • (Score: 0) by Anonymous Coward on Monday October 11 2021, @02:35PM

        by Anonymous Coward on Monday October 11 2021, @02:35PM (#1186140)

        Most of the time the ends don't justify the means. Besides, do you want surveillance by officials, or monitoring and judgment based on whatever random thing is trendy this month?

    • (Score: 2) by looorg on Monday October 11 2021, @11:52AM (2 children)

      by looorg (578) on Monday October 11 2021, @11:52AM (#1186106)

      “AI-based identification systems already misidentify minority ethnic groups, LGBTI people, seniors and women at higher rates, which is particularly concerning in the context of law enforcement and the judiciary,” reads a statement from the parliament.

      I stopped at that part too. Can a machine tell, just by looking at these people, what they are? Perhaps if you are some kind of cross-dresser, but otherwise this seems a bit weird, or a massive overstatement of capability. Or is it just if a male looks effeminate or a female looks very butch? But that doesn't mean they are homosexual or whatnot.

      That said, from that list one would conclude that white men and women are the biggest losers in the world of the surveillance society. We are the ones that can be identified easily. After all, we created the system and trained it on images that looked like us.

      • (Score: 1, Interesting) by Anonymous Coward on Tuesday October 12 2021, @01:47PM

        by Anonymous Coward on Tuesday October 12 2021, @01:47PM (#1186403)

        I think the idea is that these systems, having been trained on young straight white men, aren't particularly good at differentiating outside that training set.

        As such, they are more likely to miss obvious differences between the suspect and a person when that person is old, trans, a woman, or just not white. This is just how AI systems work: just as a plumber would be out of his element if you asked him to design a rocket engine, an AI trained on white guys will have a hard time differentiating between two Gypsies.

        It isn't that the system was developed to be racist, just that the people developing it accidentally created it in a way that made it racist. This invisible, unintentional racism is one aspect of systemic racism.
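
        As a toy illustration of that point (a purely hypothetical sketch with made-up numbers, not from the article or any real face-recognition system): if the embedding space a model learns keeps individuals from one group well separated but crowds individuals from another group together, then pairs of different people from the second group fall under the match threshold far more often, i.e. the false match rate is higher for that group.

        import numpy as np

        rng = np.random.default_rng(42)

        def false_match_rate(person_spread, n_people=200, threshold=1.0, dim=128):
            # One synthetic "embedding" per distinct person; the tighter the spread,
            # the more different people look alike to the hypothetical model.
            embeddings = rng.normal(0.0, person_spread, size=(n_people, dim))
            false_matches, pairs = 0, 0
            for i in range(n_people):
                for j in range(i + 1, n_people):
                    pairs += 1
                    # Two *different* people closer than the threshold = a false match.
                    if np.linalg.norm(embeddings[i] - embeddings[j]) < threshold:
                        false_matches += 1
            return false_matches / pairs

        # Group A: individuals well separated (well represented in training data).
        # Group B: individuals crowded together (under-represented in training data).
        print("false match rate, group A:", false_match_rate(person_spread=1.0))
        print("false match rate, group B:", false_match_rate(person_spread=0.06))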

      • (Score: 2) by FatPhil on Friday October 15 2021, @04:39AM

        by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Friday October 15 2021, @04:39AM (#1187206) Homepage
        The FR AI literature in the last 5 years has certainly contained pronouncements like "AI guesses sexuality from a photo better than humans can". That particular example hit an ethics committee, and they hit back pretty hard, saying "this could be used for bad, therefore must be outlawed", not understanding that that logic could be applied to phones, computers, cars, scissors, and humans.

        Meanwhile, China continues doing research in FR, and I hear it has a pretty good Uighur detector now.
        --
        Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
    • (Score: 0) by Anonymous Coward on Monday October 11 2021, @01:20PM

      by Anonymous Coward on Monday October 11 2021, @01:20PM (#1186121)

      But you can't just go to court with a printed note that says "AI says it's a match".

      "But mommie, I don't want to think, I want the machine to do it for me. Thinking is HARD!"

    • (Score: 2, Interesting) by Anonymous Coward on Monday October 11 2021, @03:55PM (1 child)

      by Anonymous Coward on Monday October 11 2021, @03:55PM (#1186173)

      If you have ubiquitous surveillance, you can map people's social graphs. From these social graphs, you can, with high accuracy, categorize individuals.

      There was a study in Turkey which used data from a dump of the social network Friendster to see if they could identify homosexual people using nothing but a person's social graph. They had a success rate of over 90%.

      So, yes, this could be used to identify LGBT people or, probably closer to home for you, Nazis.
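
      The parent's idea can be sketched very simply (hypothetical names and data, not taken from whatever study that was): given a friendship graph and a handful of self-disclosed labels, even a crude "take the majority label of your friends" rule starts guessing the hidden attribute of users who never disclosed anything.

      from collections import Counter

      # Toy friendship graph: node -> set of friends (made-up people).
      graph = {
          "alice": {"bob", "carol", "dave"},
          "bob":   {"alice", "carol"},
          "carol": {"alice", "bob", "eve"},
          "dave":  {"alice", "frank"},
          "eve":   {"carol", "frank"},
          "frank": {"dave", "eve"},
      }

      # Attribute known only for users who disclosed it themselves.
      known = {"bob": "X", "carol": "X", "eve": "Y", "frank": "Y"}

      def infer(node):
          # Guess a hidden attribute from the labels of a node's friends (homophily).
          votes = Counter(known[f] for f in graph[node] if f in known)
          return votes.most_common(1)[0][0] if votes else None

      for hidden in ("alice", "dave"):
          print(hidden, "->", infer(hidden))   # alice -> X, dave -> Y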

      • (Score: -1, Flamebait) by Anonymous Coward on Tuesday October 12 2021, @07:32AM

        by Anonymous Coward on Tuesday October 12 2021, @07:32AM (#1186369)

        If you have this and that and those, then cows would fly. BUT WE ARE TALKING ABOUT FACIAL RECOGNITION HERE. Was I talking about all-around surveillance? NO, I WAS NOT, AND IT WAS NOT THE TOPIC.

        Pretty fucking quick trigger finger you've got there with that Nazi card. You piece of shit, you know nothing about my life. You are the problem here with your agitation. Unlike you, I know what the topic is and I am a rational person.

  • (Score: -1, Troll) by Anonymous Coward on Monday October 11 2021, @12:08PM (1 child)

    by Anonymous Coward on Monday October 11 2021, @12:08PM (#1186108)

    Good on the editor for at least putting the definition of yet another stupid acronym in the story.

    Generally, though, the definition comes right after the first use, NOT at the end.

    Oh, and how does the AI "misidentify" LGBTI?
    Lesbo
    oops...maybe Trans. man
    BiSexual
    oops...an Eunuch
    Gay
    oops...sometimes, usually after drinking heavily.
    Queer
    oops...Spanish Catalan with heavy lisp.
    InterSex
    oops...really confused person of interest;

    Where does this shit fucking end?

    • (Score: 2) by maxwell demon on Monday October 11 2021, @06:06PM

      by maxwell demon (1608) on Monday October 11 2021, @06:06PM (#1186220) Journal

      That was a publication from Europe (and in particular from a former EU country), written for people in the country of publication. Do you also complain when you see e.g. POTUS in an American article?

      Also, the explanation was at the end because it was not part of the quoted article, but something added here on SN to help non-European readers who might not be familiar with that abbreviation.

      --
      The Tao of math: The numbers you can count are not the real numbers.
  • (Score: 3, Funny) by Anonymous Coward on Monday October 11 2021, @01:39PM (2 children)

    by Anonymous Coward on Monday October 11 2021, @01:39PM (#1186126)

    https://1.bp.blogspot.com/-Us_G1Nk_Gvo/YWNntibxziI/AAAAAAAAuLs/DDEziPD3_gENb0ml-bPvyhWyHYD6OHLSACPcBGAsYHg/s998/Birds%2Baren%2527t%2Breal.png [blogspot.com]

    I'm not sure whose van that is, though. Azuma? Fusty? Maybe aristarchus? Help me out here, folks.

    • (Score: 3, Funny) by OrugTor on Monday October 11 2021, @04:37PM

      by OrugTor (5147) Subscriber Badge on Monday October 11 2021, @04:37PM (#1186188)

      Time to ramp up your van recognition software.

    • (Score: 1, Funny) by Anonymous Coward on Tuesday October 12 2021, @12:12AM

      by Anonymous Coward on Tuesday October 12 2021, @12:12AM (#1186318)

      What's an IREAL?
