
posted by janrinok on Saturday July 28 2018, @06:50PM
from the they're-criminals dept.

The American Civil Liberties Union, in an effort to demonstrate the dangers of face recognition technology, ran photos of members of Congress against a database of mug shots using Amazon Rekognition software. That test incorrectly identified 28 legislators as criminals (cue the jokes - yes, the Congress members were confirmed to be elsewhere at the time). They hope that demonstrating that this risk hits close to home will get Congress more interested in regulating the use of this technology.

The false matches were disproportionately of people of color, including six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-Ga.). These results demonstrate why Congress should join the ACLU in calling for a moratorium on law enforcement use of face surveillance.

[...] If law enforcement is using Amazon Rekognition, it’s not hard to imagine a police officer getting a “match” indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins. Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification.


Original Submission

  • (Score: 2) by The Shire on Sunday July 29 2018, @04:57AM (12 children)

    by The Shire (5824) on Sunday July 29 2018, @04:57AM (#714238)

    People are deliberately conflating the concept of a filter with an ID.

    When you enter search terms into an engine you don't expect to get back one exact match for what you were looking for. You expect to get a list of narrowed down results which you then parse using your own judgement. Facial recognition is the same way. It's not intended to "finger the bad guy". It's intended to narrow the field to reduce the workload on the investigator who then uses their judgement to decide who in the returned list, if anyone, is who they were looking for.
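The filter-versus-ID distinction the comment draws can be sketched in a few lines: the system scores every face in a database against a probe photo and returns all candidates above a confidence threshold, ranked, for a human to review. (This is a minimal illustration; the similarity scores are made-up stand-ins, not the output of Rekognition's actual model.)

```python
# Minimal sketch of face recognition as a *filter*: return every
# candidate above a confidence threshold, ranked best-first, as a
# narrowed field for an investigator - not a single positive ID.
# The similarity scores here are hypothetical stand-in values.

def shortlist(scores, threshold=0.80):
    """scores: {person_id: similarity in [0, 1]} for one probe photo.
    Returns (person_id, similarity) pairs at or above threshold,
    sorted with the strongest candidate first."""
    hits = [(pid, s) for pid, s in scores.items() if s >= threshold]
    return sorted(hits, key=lambda t: t[1], reverse=True)

# One probe photo scored against a small mug-shot database:
scores = {"suspect_A": 0.91, "suspect_B": 0.83, "suspect_C": 0.62}
print(shortlist(scores))                   # two candidates to review
print(shortlist(scores, threshold=0.95))   # stricter cut: none
```

Raising the threshold shrinks the shortlist (fewer false positives, more misses); lowering it does the opposite, which is exactly the knob the rest of the thread argues about.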

    And it's deceptive to use such a small sample size to train the system on and then select an 80% confidence level and expect a precise result. 95% confidence would be more appropriate, and I have no doubt they tried that, didn't get the result they wanted, and reduced the threshold until they got the false positives they needed for a headline.

  • (Score: 4, Interesting) by janrinok on Sunday July 29 2018, @07:09AM (3 children)

    by janrinok (52) Subscriber Badge on Sunday July 29 2018, @07:09AM (#714253) Journal

    I have no doubt they tried that and didn't get the result they wanted

    And you have no doubt that they did this because you have evidence of it, or simply because it fits your own point of view? It might have happened, but there is nothing to support that speculative statement. You appear to be just as guilty of spinning a story in the same way that you accuse the ACLU of doing.

    • (Score: 2) by The Shire on Sunday July 29 2018, @01:12PM (2 children)

      by The Shire (5824) on Sunday July 29 2018, @01:12PM (#714338)

      If the ACLU were only interested in accuracy, they would have properly portrayed that this tool is only used to narrow the field, not as a way to ID a criminal. That strongly implies they are being intentionally deceptive to further their own agenda. I don't think my comment spins that at all.

      • (Score: 3, Insightful) by number11 on Sunday July 29 2018, @07:56PM (1 child)

        by number11 (1170) Subscriber Badge on Sunday July 29 2018, @07:56PM (#714442)

        If the ACLU were only interested in accuracy, they would have properly portrayed that this tool is only used to narrow the field, not as a way to ID a criminal. That strongly implies they are being intentionally deceptive to further their own agenda.

        As does the approach of thinking that no policeman will ever act as if it is a way to ID a criminal. The same way that policemen believe that drug dogs are always right, and that bargain-basement field tests for drugs are accurate.

        The false positive rate is 5%. That means that one out of every 20 people passing the cop will be treated as a criminal. And it will probably happen repeatedly to the same 5%.
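The arithmetic behind this worry is easy to check. Taking the comment's assumed 5% false-positive rate and applying it to the 535 members of Congress the ACLU scanned, the expected number of false matches lands close to the 28 the test actually produced:

```python
# Expected false matches at an assumed false-positive rate, applied
# to the ACLU test: 535 members of Congress, 28 reported false hits.
members = 535
fp_rate = 0.05                 # "one out of every 20"
expected = members * fp_rate   # expected false matches at that rate

print(expected)                # 26.75 - close to the 28 observed
print(28 / members)            # observed rate: about 5.2%
```

The same per-scan rate compounds for anyone scanned repeatedly, which is the "happen repeatedly to the same 5%" point: ten independent scans at 5% give roughly a 40% chance of at least one false match (1 - 0.95**10).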

        • (Score: 2) by The Shire on Monday July 30 2018, @01:24AM

          by The Shire (5824) on Monday July 30 2018, @01:24AM (#714558)

          When the confidence of a neural net is set to 80% I can promise you the false positive rate is MUCH higher than 5%. Again, they're using this tool incorrectly. It's a filter not a pointer. It returns a large set of possible matches that a human must then sift through and decide which if any is an actual match. No one out there is using this for traffic stops expecting it to be a positive ID. The ACLU is making claims that a system is failing to do what it was never designed or intended to do.

          If you give me a hammer and I then run around smashing car windows, you don't blame the hammer for being destructive, you blame the user for using the tool incorrectly. That's the case here as well. This software is not intended to positively ID anyone, nor is it offered as something that does so.

  • (Score: 3, Informative) by c0lo on Sunday July 29 2018, @10:48AM (6 children)

    by c0lo (156) Subscriber Badge on Sunday July 29 2018, @10:48AM (#714293) Journal

    And it's deceptive to use such a small sample size to train the system on and then select an 80% confidence level and expect to get a precise result.

    Actually "80% accuracy" setting is the Amazon's default. Wanna bet your average doughnut munching policeman won't use other settings? TFA

    Using Rekognition, we built a face database and search tool using 25,000 publicly available arrest photos. Then we searched that database against public photos of every current member of the House and Senate. We used the default match settings that Amazon sets for Rekognition.
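For reference, the knob in question is the FaceMatchThreshold parameter of Rekognition's SearchFacesByImage call, whose service default is 80. A hedged sketch of the kind of query TFA describes (the collection name is made up, and a real run needs AWS credentials, boto3, and a face collection populated with the mug shots):

```python
# Hedged sketch of a Rekognition mug-shot search like the one TFA
# describes. "arrest-photos" is a hypothetical collection name;
# FaceMatchThreshold=80 mirrors the service's default setting.

def search_mugshots(client, image_bytes, threshold=80, max_faces=5):
    """Search a face collection for candidates matching a probe photo.
    Returns (face_id, similarity) pairs - a shortlist, not an ID."""
    resp = client.search_faces_by_image(
        CollectionId="arrest-photos",
        Image={"Bytes": image_bytes},
        FaceMatchThreshold=threshold,
        MaxFaces=max_faces,
    )
    return [(m["Face"]["FaceId"], m["Similarity"])
            for m in resp["FaceMatches"]]

# Real usage (requires AWS credentials and a populated collection):
#   import boto3
#   client = boto3.client("rekognition", region_name="us-east-1")
#   with open("probe.jpg", "rb") as f:
#       print(search_mugshots(client, f.read()))
```

Anyone who never touches that default gets the 80% behavior the ACLU tested, which is the point of the comment above.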

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 2) by The Shire on Sunday July 29 2018, @01:08PM (5 children)

      by The Shire (5824) on Sunday July 29 2018, @01:08PM (#714335)

      You're missing the point of my post: it's a filter. Even at 80% it generates a narrowed list of subjects to investigate; it doesn't point the finger at one person. This is a tool, not a jury. The ACLU is deliberately misleading the public about how this is used.

      • (Score: 2, Insightful) by Anonymous Coward on Sunday July 29 2018, @01:52PM (1 child)

        by Anonymous Coward on Sunday July 29 2018, @01:52PM (#714347)

        The people with the most skin in the game to deliberately mislead the public about how this will be used are Amazon and law enforcement. Unless we keep a close watch on them, this will be used as a pointer not a filter.

        • (Score: 1, Insightful) by Anonymous Coward on Sunday July 29 2018, @03:41PM

          by Anonymous Coward on Sunday July 29 2018, @03:41PM (#714377)

          Yup! And it won't stop innocent people from being harassed or worse.

      • (Score: 3, Insightful) by c0lo on Sunday July 29 2018, @09:30PM (2 children)

        by c0lo (156) Subscriber Badge on Sunday July 29 2018, @09:30PM (#714465) Journal

        You're missing the point of my post: it's a filter.

        It will be used as a pointer.
        Step into the policeman's shoes, with limited (objectively or subjectively) effort capacity:
        - without the tech, he needs to search for other clues first and apply the "looks like" filter later
        - with the tech, he gets a list of persons "identified" as possible suspects, and he'll very likely start working from that list.

        The ACLU is deliberately misleading the public about how this is used.

        Really? Is it already used?
        If not, how can you be so sure?

        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
        • (Score: 2) by HiThere on Sunday July 29 2018, @11:42PM

          by HiThere (866) Subscriber Badge on Sunday July 29 2018, @11:42PM (#714507) Journal

          FWIW, it *is* already used. (Possibly not exactly this software.) I don't know that it's used by police, but I believe it was yesterday or the day before that there was news on Soylent that two Canadian malls were using it, and that the owners of those malls claimed that "others were using it". It's true they also claimed they didn't track the data that would allow individuals to be identified. Believe them if you want to; it might be true.

          --
          Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
        • (Score: 2) by The Shire on Monday July 30 2018, @01:27AM

          by The Shire (5824) on Monday July 30 2018, @01:27AM (#714560)

          Yes, really, it's already being used:

          https://www.npr.org/2018/05/22/613115969/orlando-police-testing-amazons-real-time-facial-recognition [npr.org]

          And further:

          "The Washington County Sheriff's Office says it does not use Rekognition in real time and doesn't intend to."

          So they're using it correctly: as an investigative tool, not something that attempts to positively ID people in real time.

  • (Score: 0) by Anonymous Coward on Monday July 30 2018, @03:33AM

    by Anonymous Coward on Monday July 30 2018, @03:33AM (#714594)

    > When you enter search terms into an engine you don't expect to get back one exact match for what you were looking for.

    Uh, yes I do. That expectation is why it's so irritating when the search engines randomly remove words from my search to give me more irrelevant results to sift through.