
posted by janrinok on Saturday July 28 2018, @06:50PM   Printer-friendly
from the they're-criminals dept.

The American Civil Liberties Union, in an effort to demonstrate the dangers of face recognition technology, ran photos of members of Congress against a database of mug shots using Amazon Rekognition software. That test incorrectly identified 28 legislators as criminals (cue the jokes - yes, the Congress members were confirmed to be elsewhere at the time). They hope that demonstrating that this risk hits close to home will get Congress more interested in regulating the use of this technology.

The false matches were disproportionately of people of color, including six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-Ga.). These results demonstrate why Congress should join the ACLU in calling for a moratorium on law enforcement use of face surveillance.

[...] If law enforcement is using Amazon Rekognition, it’s not hard to imagine a police officer getting a “match” indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins. Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification.


Original Submission

 
  • (Score: 2) by The Shire on Sunday July 29 2018, @01:12PM (2 children)

    by The Shire (5824) on Sunday July 29 2018, @01:12PM (#714338)

    If the ACLU were only interested in accuracy, they would have properly portrayed this tool as something used only to narrow the field, not as a way to ID a criminal. That strongly implies they are being intentionally deceptive to further their own agenda. I don't think my comment spins that at all.

  • (Score: 3, Insightful) by number11 on Sunday July 29 2018, @07:56PM (1 child)

    by number11 (1170) Subscriber Badge on Sunday July 29 2018, @07:56PM (#714442)

    If the ACLU were only interested in accuracy, they would have properly portrayed this tool as something used only to narrow the field, not as a way to ID a criminal. That strongly implies they are being intentionally deceptive to further their own agenda.

    As does the approach of thinking that no policeman will ever act as if it is a way to ID a criminal. The same way that policemen believe that drug dogs are always right, and that bargain-basement field tests for drugs are accurate.

    The false positive rate is 5%. That means that one out of every 20 people passing the cop will be treated as a criminal. And it will probably happen repeatedly to the same 5%.
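    The arithmetic behind that claim can be sketched briefly. This is a minimal illustration, not anything from the ACLU test itself beyond the reported 28-of-535 result; the scan count is an invented example:

    ```python
    # Sketch of the false-positive arithmetic above. The ACLU test
    # falsely matched 28 of the 535 members of Congress, i.e. a rate
    # of roughly 5%, or about 1 in 20.
    false_matches = 28
    people_tested = 535
    fpr = false_matches / people_tested   # ~0.052

    # At that rate, scanning a hypothetical crowd of 1,000 people
    # would flag ~52 of them as "matches" to someone in the database.
    people_scanned = 1000                 # invented example figure
    expected_flags = fpr * people_scanned

    print(round(fpr, 3))        # ~0.052
    print(round(expected_flags))  # ~52
    ```

    And because face matching is not random per encounter, the same individuals (those who happen to resemble someone in the mug-shot database) would plausibly be flagged again and again, as the comment notes.
    
    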

    • (Score: 2) by The Shire on Monday July 30 2018, @01:24AM

      by The Shire (5824) on Monday July 30 2018, @01:24AM (#714558)

      When the confidence of a neural net is set to 80%, I can promise you the false positive rate is MUCH higher than 5%. Again, they're using this tool incorrectly. It's a filter, not a pointer. It returns a large set of possible matches that a human must then sift through to decide which, if any, is an actual match. No one out there is using this for traffic stops expecting it to be a positive ID. The ACLU is claiming a system is failing to do what it was never designed or intended to do.
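      The "filter, not pointer" usage described above can be sketched roughly as follows. This is a hypothetical illustration, not the Rekognition API; the candidate names and confidence scores are invented:

      ```python
      # Hypothetical sketch: a face search returns many candidates, each
      # with a confidence score. Raising the threshold narrows the set a
      # human must still review -- it never yields a positive ID by itself.
      candidates = [
          ("record_a", 0.97),   # all names/scores here are invented
          ("record_b", 0.85),
          ("record_c", 0.81),
          ("record_d", 0.62),
      ]

      def narrow(matches, threshold):
          """Keep only candidates at or above the confidence threshold."""
          return [name for name, score in matches if score >= threshold]

      print(narrow(candidates, 0.80))  # three candidates left for human review
      print(narrow(candidates, 0.95))  # only the strongest candidate remains
      ```

      The point of the comment is that the output of `narrow` is the *start* of a human review process, not an identification, and that a low threshold like 80% deliberately trades precision for recall.
      
      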

      If you give me a hammer and I then run around smashing car windows, you don't blame the hammer for being destructive, you blame the user for using the tool incorrectly. That's the case here as well. This software is not intended to positively ID anyone, nor is it offered as something that does so.