
posted by mrpg on Saturday May 12 2018, @10:56AM
from the no-more-safety-in-numbers dept.

Submitted via IRC for SoyCow4408

Thousands of attendees of the 2017 Champions League final in Cardiff, Wales, were mistakenly identified as potential criminals by facial recognition technology used by local law enforcement.

According to the Guardian, the South Wales police scanned the crowd of more than 170,000 people who traveled to the nation’s capital for the soccer match between Real Madrid and Juventus. The cameras identified 2,470 people as criminals.

Having that many potential lawbreakers in attendance might make sense if the event were, say, a convict convention, but it seems pretty high for a soccer match. As it turned out, the cameras were a little overly aggressive in trying to spot some bad guys. Of the potential criminals identified, 2,297 were wrongly labeled by the facial recognition software. That's a 92 percent false positive rate.

Source: Facial Recognition Used by Wales Police Has 90 Percent False Positive Rate


Original Submission

 
  • (Score: 3, Interesting) by looorg on Saturday May 12 2018, @12:02PM (15 children)

    by looorg (578) on Saturday May 12 2018, @12:02PM (#678791)

    Is it really that many? If you have a crowd of 170k people and it spots 2470 as criminals (or any other selecting factor), that is just about 1.45% of the people in the crowd. Is that really too many, or an unrealistic number? It's a soccer game after all; there are bound to be thousands of hooligans in there that would be registered.

    So apparently 93% of those spotted (it's really a lot closer to 93 than 92) were mistakes when compared or rechecked later -- how, and by whom? Did they just recheck those 2470 people by hand, or did they let the machine take its sweet time? So there really needs to be some fine tuning.

    On the other hand, IF you have an international soccer game in Wales between Real Madrid (Spanish) and Juventus (Italian), one can assume a fair number of fans coming from those two countries to watch -- did they get face data of criminals from those countries? Or was it just a matter of them looking dark(er) than most Welsh people, and therefore foreign and suspect? (There have been umpteen news stories about how face-recognition error rates are apparently much higher for Black and Asian faces than for white faces.) It takes a certain type of fanatic to travel across Europe to watch your team play, so a fair few of them are probably soccer hooligans.
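
    As a quick sanity check on those percentages, here's a minimal sketch in Python, using only the figures from the story:

        # Figures from the story: 170,000 in the crowd, 2,470 flagged, 2,297 wrong.
        crowd, flagged, wrong = 170_000, 2_470, 2_297

        print(f"share of crowd flagged: {flagged / crowd:.2%}")          # 1.45%
        print(f"share of flags that were wrong: {wrong / flagged:.1%}")  # 93.0%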

  • (Score: 5, Informative) by bzipitidoo on Saturday May 12 2018, @12:53PM (3 children)

    by bzipitidoo (4388) on Saturday May 12 2018, @12:53PM (#678804) Journal

    I worked on this problem over 10 years ago. Law enforcement, security, and others have been wanting this for a long time, but don't seem to appreciate the difficulties. They have on the order of a million mugshots covering the entire nation, so yes, that is about 1.5% of the population who have ever been accused of a crime serious enough to warrant a mugshot. Of course not all of the accused were guilty or convicted, but you know how law enforcement is. They never throw any photos out, because everyone is a potential criminal.

    A major difficulty is the sheer quantity. 93% false positive rate? Suppose the facial recognition technique was so good that it was correct 99.99% of the time (a decade ago, the best was struggling to achieve a mere 90%) when asked to compare just two pictures and give a yes/no answer to the question: are these two photos of the same person? That means when comparing one photo to a database of a million, it could still pick out 100 people from the database who look like the photo it is checking. Obviously, at most one person is a match. But this problem is worse than that. It's attempting to find matches between a group of some 100,000 photos and a separate group of 1,000,000 photos. That's some serious Birthday Paradox kind of trouble. Yes, software can be better at this task than the best human observers, better than a whole room full of "I never forget a face" people, and still struggle to manage such enormous quantities. That they got only 2470 false positives is, frankly, very impressive; it implies the software is several 9s better than even 99.99%, though maybe still not good enough for regular use.
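
    To put numbers on that, here's a rough sketch (the 100,000-photo crowd and 1,000,000-mugshot database are just the round figures above, and treating the job as independent pairwise checks is a simplifying assumption):

        database = 1_000_000   # assumed mugshot database size
        crowd = 100_000        # assumed number of crowd photos
        flagged = 2_470        # hits reported at the Cardiff match

        # A 99.99%-accurate pairwise test still flags ~100 lookalikes per probe photo:
        print(round(database * (1 - 0.9999)))                    # 100

        # Working backwards from the reported number of hits:
        comparisons = crowd * database                           # 1e11 pairwise checks
        implied = flagged / comparisons
        print(f"implied per-pair error rate: {implied:.2e}")     # 2.47e-08
        print(f"implied pairwise accuracy: {1 - implied:.9f}")   # 0.999999975

    That's roughly seven 9s per pairwise comparison, just to produce the error count reported at Cardiff.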

    • (Score: 3, Insightful) by requerdanos on Saturday May 12 2018, @04:07PM (1 child)

      by requerdanos (5997) Subscriber Badge on Saturday May 12 2018, @04:07PM (#678854) Journal

      That they got only 2470 false positives is, frankly, very impressive

      Context is key here.

      Wow, in the research realm, that is very impressive--instead of sucking a truckload of metric megatons, it only sucks a couple of metric megatons! Very, very good, quantifiable improvement. Well done! In a century or less this technology might be useful not only under carefully controlled lab conditions, but maybe, possibly, in the field!

      though maybe still not good enough for regular use.

      Nod. They aren't doing research and comparing numbers. They are running this against actual people who have done nothing to raise suspicion that they should be targeted in this way (just being in public doesn't count), so that cops who might or might not have their best interests at heart (hint: several 9s in the "not" direction) can interfere with them and cause them problems.

      The article says that the cops in question have been very diligent about checking to make sure the matches are correct, and have not made false arrests based on the system. That might sound great, but it means they are investigating thousands of innocent people.

      From TFA:

      "...not only is real-time facial recognition a threat to civil liberties, it is a dangerously inaccurate policing tool" ... It also makes innocent people into suspects, entirely unbeknownst to them, until police can clear their name.

      That's bad news all around.

      Look, I get it, I see the seductive lure of an automated system under which an alarm goes off "Ding!" at the security station with a message "Bill McEvil Spotted! 8 outstanding warrants! Section A-3" but I shudder at the freedom-removing big-brother aspect of it.

      • (Score: 2) by bzipitidoo on Saturday May 12 2018, @07:03PM

        by bzipitidoo (4388) on Saturday May 12 2018, @07:03PM (#678905) Journal

        Hey, I agree that the police shouldn't be given free rein to go on fishing expeditions of any sort. No, they shouldn't be allowed to examine your smartphone's memory when you cross the border. No, they don't get to set up roadblocks and stop everyone passing by.

        Just last summer I was caught twice in roadblocks on I-10 (along with I-8, I-10 is the closest major highway that runs parallel to the Mexican border) by the Border Patrol, asking everyone if they were US citizens. I suspect they were encouraged to do that by a certain president who wants to build a big wall along that border. Or they have long-standing permission to do that within 100 miles of the border. But it was still a violation of our civil liberties. You have to use I-40, further to the north, to be safe from that particular annoyance. At another roadblock, it turned out the local police had taken the initiative, and didn't actually have permission or sanction from the city. Yet they had the cheek to set up a roadblock on not just a modest highway, but the Interstate. When citizens started asking them who had authorized the affair, they hurriedly packed up the roadblocks and cleared out.

        Like roadblocks, this photo identification of everyone shouldn't be allowed, no matter how much less obtrusive it is. The roadblock is seriously obnoxious in that it can quickly cause a huge traffic jam, which taking a bunch of photos need not do. Yet the main danger is the inversion of the innocent-until-proven-guilty principle, combined with a police mentality that doesn't understand the limitations of its tools.

        For instance, I read that for many years Houston misused a flawed drug test on motorists pulled over for traffic violations, and put a lot of innocent people in prison. They made it real simple: if the magic test tube turned the magic color, the substance was an illegal drug, end of discussion. Except that it would give all kinds of false positives for various routine over-the-counter drugs and other innocent substances, and the police ignored those inconvenient, complicating details. Many of the accused were unfairly pressured into admitting guilt and taking bad plea bargains, ground up by a legal machinery that values the appearance of infallibility and toughness on crime, and counts convictions the way a baseball team counts runs.

    • (Score: 2) by MichaelDavidCrawford on Sunday May 13 2018, @03:18AM

      I once stumbled into a domestic violence incident, and so called 911 as the culprit walked away.

      I was unable to remember the color of his clothing or whether he wore a hat.

      All I could remember was that the victim said he had a gun.

      --
      Yes I Have No Bananas. [gofundme.com]
  • (Score: 2) by JoeMerchant on Saturday May 12 2018, @01:24PM (6 children)

    by JoeMerchant (3937) on Saturday May 12 2018, @01:24PM (#678810)

    So, is it better to have to check 10 faces with wetware per criminal match, or 100 faces with wetware per criminal match?

    As long as the "potentials" are not disturbed in any way until they're confirmed by wetware, this isn't any different than employing 10x as many cops to watch cameras.

    --
    🌻🌻 [google.com]
    • (Score: 4, Informative) by frojack on Saturday May 12 2018, @04:36PM (5 children)

      by frojack (1554) on Saturday May 12 2018, @04:36PM (#678860) Journal

      As long as the "potentials" are not disturbed in any way

      Read ArsTechnica's report on the same issue:
      https://arstechnica.com/tech-policy/2018/05/uk-police-say-92-percent-false-positive-facial-recognition-is-no-big-deal/ [arstechnica.com]

      You sound a lot like the police spokesman:

      "However, since we introduced the facial recognition technology, no individual has been arrested where a false positive alert has led to an intervention and no members of the public have complained."
      ...
      In a public statement, the SWP said that it has arrested "over 450" people as a result of its facial recognition efforts over the last nine months.

      Parse it carefully, and you see things are exactly the opposite of what you suppose.

      False positive's do lead to "intervention".
      Hauled in for questioning, detained, cuffed? ID demanded? What exactly is an intervention? How would they know it was a false positive unless they demanded identification?

      All we have is HIS word that the subject was not ultimately arrested, and that all of them were intimidated out of filing a complaint.

      How do you feel about NYC's stop-and-frisk policy? Most of those stops lead to no arrest, after all. It's the same thing, except the NYC cop probably has a LOT more reason for suspicion than the distance between someone's eyes.

      Is there a record of those "interventions" following these people around? Do they all "have form" now?
      What about being hauled off for questioning, required to show ID in front of friends and family?
      Will there be a public statement of exoneration?

      How many of those 450 arrested people subsequently were not prosecuted, or not convicted for lack of evidence?
      Present evidence would suggest somewhere in the neighborhood of 425.

      --
      No, you are mistaken. I've always had this sig.
      • (Score: 2) by takyon on Saturday May 12 2018, @05:53PM (1 child)

        by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Saturday May 12 2018, @05:53PM (#678889) Journal

        False positive's

        You have the right to remain silent. Any apostrophe you type can and will be used against you in a court of public opinion.

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 2) by frojack on Saturday May 12 2018, @06:11PM

          by frojack (1554) on Saturday May 12 2018, @06:11PM (#678890) Journal

          I reserve my best punctuation for people who pay me, your Honor.

          Careful parsing suggests you successfully avoided Muphry's law. Congratulations.

          --
          No, you are mistaken. I've always had this sig.
      • (Score: 2) by JoeMerchant on Saturday May 12 2018, @07:49PM (2 children)

        by JoeMerchant (3937) on Saturday May 12 2018, @07:49PM (#678920)

        I'm not in favor of the technology, but I don't think anyone will be stopping it except the judges, who might take a rather dim view of collars that started from mass surveillance. To an extent this has been happening in the US with traffic cams: in many jurisdictions, all you have to do to have a traffic-cam ticket nullified is take it to court (in others, not so much; YMMV).

        This tech is like giving police a huge boost in their budget for a fraction of the price. Newsflash: if we the people really wanted more police, we'd already be electing politicians who would pay for more police. A huge shift in police budget is a huge shift in personal liberty. Perhaps, if police can deploy this tech effectively, without trampling the rights of individuals any more than they already do, we can use it to shrink police budgets while keeping the same level of effectiveness. Then all those would-have-been cops can be out of work alongside the rest of the robot-displaced workforce. UBI, anyone?

        --
        🌻🌻 [google.com]
        • (Score: 2) by frojack on Saturday May 12 2018, @10:30PM (1 child)

          by frojack (1554) on Saturday May 12 2018, @10:30PM (#678960) Journal

          Perhaps, if police can deploy this tech effectively, without trampling the rights of the individuals any more than they already do, we can use it to shrink police budgets while keeping the same level of effectiveness.

          But you really can't employ it effectively without more police. Somebody has to follow up all the "hits", and they have to do it very quickly, or the hit will be gone from the scene.

          So more cops.
          (And that assumes the perfect case, with NO false positives).

          With false positives it amounts to just another reason cops with lots of guns can show up at anyone's door with the lame excuse that their "finely tuned" RECO system spotted a person walking down the street.

          Is it worth it?

          --
          No, you are mistaken. I've always had this sig.
          • (Score: 2) by JoeMerchant on Saturday May 12 2018, @11:53PM

            by JoeMerchant (3937) on Saturday May 12 2018, @11:53PM (#678980)

            But you really can't employ it effectively without more police. Somebody has to follow up all the "hits", and they have to do it very quickly, or the hit will be gone from the scene.

            Um.... versus the scenario where police just don't look for bad guys, ever... sure.

            I'm comparing this to a scenario where 100 cops are assigned to patrol the stadium for a big game and they've all seen the wanted posters, etc. So, instead of 200 eyes scanning 100,000 faces for known bad guys, you've got cameras, and maybe 20 eyes doing followup as directed by computer dispatch, while the other 80 can be more effective doing something other than scanning for bad guys than the whole 100 would have been if they were all trying to get a look at 1,000 faces each (or whatever the more realistic ratios are for your stadium; do you have a president or congresscritter in attendance? If so, definitely amp up the blue ratio).

            As for NO false positives, where did that come from? In my younger days I got profiled and detained more than once, and what they usually did was ask for ID and radio in my DL# to check it against wanted lists and bench warrants. So, if this system can push some of that work up front, so as not to bother every white male in his early 20s, approximately 6' tall, with dark hair, and only temporarily detain 10 people for every 1 actual bad guy, then from my perspective that's a huge improvement over the status quo.

            Anything can be abused. Cops around here don't need no steenking computerized facial recog system to detain your ass for 30 minutes or more just because you sort of fit a profile they are looking for. I even had one give me a false "ran red light" ticket to justify holding me for over 30 minutes while they sorted out whether I was who they were looking for. He knew it was bogus, but this being the days before everybody had a video recorder running 24/7, he was comfortable telling me, "you can take it to court, but when we get there it'll be your word against mine, and I'm a cop..."

            So... can it be abused? Sure. Do I think that some cops will abuse it horribly? Absolutely. Do I think that ALL cops will abuse it? No, and overall it might actually be a net positive for most people. If the cop using the system has any training or experience and knows that there's a 93% false positive rate, he or she should be pretty damned considerate and polite with people until some additional confirmation can be had. After I got out of my 20s, my experience was that most cops really aren't assholes drunk on power; most of them are actually trying to help most people, by arresting the people that need arresting and honestly trying not to screw up the life of anybody who doesn't deserve it. However, for a small minority of cops, punk-ass kids with long hair apparently deserve it even when they're not doing (and haven't done) anything wrong.

            --
            🌻🌻 [google.com]
  • (Score: 2) by ledow on Saturday May 12 2018, @02:57PM (1 child)

    by ledow (5567) on Saturday May 12 2018, @02:57PM (#678835) Homepage

    I couldn't find stats for the UK, but:

    "New research on the growth in the scope and scale of felony convictions finds that, as of 2010, 3 percent of the total US population... have served time in prison"

    If that's at all translatable, it's pretty terrible.

    They would literally do better by random chance.

    • (Score: 0) by Anonymous Coward on Saturday May 12 2018, @10:23PM

      by Anonymous Coward on Saturday May 12 2018, @10:23PM (#678958)

      Afro-Americans make up 30% of the population and your typical Afro-American has a 10% probability of having served time in prison... do the math and it makes you wonder what we'd do with all the empty prisons if they all went back to Africa.

  • (Score: 2) by Whoever on Saturday May 12 2018, @05:05PM (1 child)

    by Whoever (4524) on Saturday May 12 2018, @05:05PM (#678874) Journal

    It's not a 90% false positive rate. It's more like 1.5%.

    https://en.wikipedia.org/wiki/False_positive_rate [wikipedia.org]

    • (Score: 2) by requerdanos on Saturday May 12 2018, @09:33PM

      by requerdanos (5997) Subscriber Badge on Saturday May 12 2018, @09:33PM (#678942) Journal

      It's not a 90% false positive rate.

      Right you are. It's correct to say instead that 92% of the criminal identifications were false positives.
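
      For anyone keeping score, here's a minimal sketch of the two definitions (assuming all ~170,000 attendees were scanned, and that the remaining 173 alerts were genuine matches):

          scanned = 170_000
          alerts, false_alerts = 2_470, 2_297
          true_alerts = alerts - false_alerts        # 173
          innocents = scanned - true_alerts          # ~169,827 people not on any list

          fpr = false_alerts / innocents             # FP / (FP + TN): the Wikipedia definition
          fdr = false_alerts / alerts                # FP / (FP + TP): what the headline measured
          print(f"false positive rate: {fpr:.1%}")   # 1.4%
          print(f"false discovery rate: {fdr:.1%}")  # 93.0%

      Both numbers are true; they just answer different questions.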