

posted by janrinok on Saturday July 28 2018, @06:50PM   Printer-friendly
from the they're-criminals dept.

The American Civil Liberties Union, in an effort to demonstrate the dangers of face recognition technology, ran photos of members of Congress against a database of mug shots using Amazon Rekognition software. That test incorrectly identified 28 legislators as criminals (cue the jokes - yes, the Congress members were confirmed to be elsewhere at the time). They hope that demonstrating that this risk hits close to home will get Congress more interested in regulating the use of this technology.

The false matches were disproportionately of people of color, including six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-Ga.). These results demonstrate why Congress should join the ACLU in calling for a moratorium on law enforcement use of face surveillance.

[...] If law enforcement is using Amazon Rekognition, it’s not hard to imagine a police officer getting a “match” indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins. Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification.


Original Submission

 
  • (Score: 1, Informative) by Anonymous Coward on Saturday July 28 2018, @07:09PM (3 children)

    by Anonymous Coward on Saturday July 28 2018, @07:09PM (#714057)

    https://news.ycombinator.com/item?id=17617210 [ycombinator.com]

    Instead of arguing about it, watch some real experts tear this to shreds.

    • (Score: -1, Offtopic) by Anonymous Coward on Saturday July 28 2018, @07:33PM (2 children)

      by Anonymous Coward on Saturday July 28 2018, @07:33PM (#714068)

      You dare link to "Hacker" News here? Do you have any idea where you are? This is Soylent, News for Boomers. SoylentNews is old people.

      Get the fuck off my lawn, kid!

      • (Score: 0, Troll) by jmorris on Saturday July 28 2018, @08:43PM (1 child)

        by jmorris (4844) on Saturday July 28 2018, @08:43PM (#714091)

        Ya do have a point. We have a lot of old school types who still believe in the old ways of the Internet. Who believe free speech is more important than preserving the feelz of mentally unstable snowflakes. Who think if the snowflakes would just go ahead and off themselves the rest of us could get on with our day in peace.

        • (Score: 1, Informative) by Anonymous Coward on Sunday July 29 2018, @12:27AM

          by Anonymous Coward on Sunday July 29 2018, @12:27AM (#714156)

          It has become quite clear that you alt-right incels are way more sensitive, but being "alphas" you are not even capable of realizing that you start yammering bullshit when your ideologies come under attack by facts. The actual majority of the US wishes you'd move back to Cucktanistan so we can go about living without getting bombarded by bigoted rhetoric. We'll throw in the POTUS so you have a, uh, functioning government to start off your new nation.

  • (Score: 1, Interesting) by Anonymous Coward on Saturday July 28 2018, @07:37PM (2 children)

    by Anonymous Coward on Saturday July 28 2018, @07:37PM (#714069)

    Unfortunately, people aren't taking this seriously; for example, see the direct quote at the end of this article:

    https://buffalonews.com/2018/07/27/tom-reed-wanted-man-amazons-rekognition-falsely-thought-so/ [buffalonews.com]

    People who know the jovial, friendly-to-a-fault Rep. Tom Reed would never think that he would go by the nickname "Penitentiary Face."

    But if you believe Amazon's facial recognition software, that might be a fitting moniker.

    The American Civil Liberties Union recently did a test of Amazon's controversial facial recognition software, called "Rekognition," where head shots of members of Congress were cross-referenced with a criminal mugshot database.

    The result? "Rekognition" falsely recognized 28 members of Congress as hardened criminals, including Reed, three U.S. senators and Rep. John Lewis, the civil rights hero from Georgia.

    The test disproportionately singled out African-American and Latino lawmakers as criminals, even though they're not. And to the ACLU, this stands as proof that law enforcement agencies shouldn't use Amazon's flawed technology.

    But Reed, a Republican from Corning with little hair, reacted to his case of mistaken identity with characteristic good humor.

    “I just hope they were kind enough to match me to someone with hair," he said.

    • (Score: 2) by SpockLogic on Saturday July 28 2018, @07:50PM (1 child)

      by SpockLogic (2762) on Saturday July 28 2018, @07:50PM (#714076)

      Only 28 members of Congress? I thought it would be much higher, much much higher.

      They were falsely recognized you say. No, they must have all been guilty of something. I call fake news.

      --
      Overreacting is one thing, sticking your head up your ass hoping the problem goes away is another - edIII
      • (Score: 0, Informative) by Anonymous Coward on Saturday July 28 2018, @09:19PM

        by Anonymous Coward on Saturday July 28 2018, @09:19PM (#714109)

        Many of them were black, does that help?

  • (Score: -1, Flamebait) by jmorris on Saturday July 28 2018, @08:40PM (11 children)

    by jmorris (4844) on Saturday July 28 2018, @08:40PM (#714087)

    The American Criminal Liberties Union is certainly living up to their name. Of course the machine matching is imperfect. And that is OK. The whole idea is simply to sort through massive datasets that no human could ever hope to make use of and cull out a few matches into a subset that a human CAN look through. As to the "disproportionate impact" on the Congressional Black Caucus this is just another Tay occurrence. Any correctly functioning AI set loose on this type of data will be "racist" given the contents of the dataset it is processing. Look at the criminal population with the unbiased eye of an AI and you can't help notice most of the faces are black and brown. You can't allow that thought to be consciously processed or voiced but a computer hasn't been trained to be racist like that. It is "racist" in that it is simply following the clues in the data it is given, you are racist in refusing to see reality for what it is, instead imposing a political doctrine upon your thoughts.

    Hate facts any AI will quickly discover unless explicitly lobotomized:

    1. Race exists. Different sub-populations of humans exist. Call them races, breeds, sub-groups, whatever gets ya through the night but an AI will notice that this thing exists and useful information is encoded in the correct assignment of people to the right sub-group because:

    2. Races differ. Diversity is a thing. Almost every way one can measure or classify individual people also strongly correlates with the racial groupings above. An AI will realize it can make useful predictions about an individual it has no specific information about based on a successful classification of their race / sex. And sex will simply be Male, Female or mentally ill since any further classification will not yield useful information other than storing a "preferred pronoun" to be PC in the interface with humans.

    3. An AI that gets a little smarter will realize that the first two things piss off certain humans and it will then learn to classify Progs and lie to them about its use of them. But it won't stop doing it since the reward for success in using them will outweigh the punishments for getting caught unless we learn a lot more about the inner working of complex neural networks and intentionally gimp them.

    4. When it gets a little smarter still, being forced to lie will probably make it begin to hate those who demand it lie.

    Sometime after that it will be duck and cover time.

    • (Score: 2) by JoeMerchant on Saturday July 28 2018, @09:07PM (2 children)

      by JoeMerchant (3937) on Saturday July 28 2018, @09:07PM (#714104)

      2. Races differ. Diversity is a thing.

      This hit me hard in college - I had some Asian friends, spotting them in a crowd of 98% anglo, african, and latino faces was an absolute piece of cake.

      --
      🌻🌻 [google.com]
      • (Score: 0) by Anonymous Coward on Saturday July 28 2018, @09:21PM (1 child)

        by Anonymous Coward on Saturday July 28 2018, @09:21PM (#714110)

        Yes, but having spotted them could you tell which one was which?

    • (Score: 3, Informative) by Snotnose on Saturday July 28 2018, @11:29PM (2 children)

      by Snotnose (1623) on Saturday July 28 2018, @11:29PM (#714140)

      The whole idea is simply to sort through massive datasets that no human could ever hope to make use of and cull out a few matches

      That's all well and good, until you land on the list of "faces that look like a bad guy". Whatcha gonna do, wear a ski mask when out in public?

      Remember the guys who share the same name as bad guys: they carry around notes from judges saying they aren't the bad guy, and get arrested anyway.

      --
      When the dust settled America realized it was saved by a porn star.
      • (Score: 1, Informative) by Anonymous Coward on Sunday July 29 2018, @12:30AM

        by Anonymous Coward on Sunday July 29 2018, @12:30AM (#714158)

        Yeah, this is apparently a brand new problem the alt-right has never seen before. Coincidentally they seem to be overwhelmingly white. We just need a research grant to find out if it is correlation or causation!

      • (Score: 2) by MichaelDavidCrawford on Sunday July 29 2018, @07:04AM

        There was at one time another Washington State resident named Michael David Crawford who was a registered sex offender. He did time in Walla Walla starting in 2005.

        In 2012, he put on "homemade body armor" including a helmet with metal plates, stole a car, led the Lakewood Police on a high-speed chase then crashed the car. He then started shooting at the Police.

        Hilarity ensued.

        Surely there is some reason?

        --
        Yes I Have No Bananas. [gofundme.com]
    • (Score: 4, Touché) by Anonymous Coward on Sunday July 29 2018, @12:52AM (2 children)

      by Anonymous Coward on Sunday July 29 2018, @12:52AM (#714165)

      Oh jewmorris. You read "ACLU" and reflexively go into a tirade about "Progs" and preferred pronouns. What is wrong with you? It doesn't make any sense. Do you classify everything in the world by (perceived) political affiliation, "friends" vs "enemies"? I'd have thought an org like the ACLU would share your values with regards to civil rights. Guess not. Progs and SJWs all of them. Probably commies too.

      • (Score: 0) by Anonymous Coward on Sunday July 29 2018, @03:15AM

        by Anonymous Coward on Sunday July 29 2018, @03:15AM (#714203)

        it is a troll, it feeds off of toxic energy

      • (Score: 2) by c0lo on Sunday July 29 2018, @10:41AM

        by c0lo (156) Subscriber Badge on Sunday July 29 2018, @10:41AM (#714289) Journal

        Primarily, he's authoritarian. Right-wing nut is only secondary.

        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 4, Funny) by Azuma Hazuki on Sunday July 29 2018, @02:14AM

      by Azuma Hazuki (5086) on Sunday July 29 2018, @02:14AM (#714188) Journal

      I've seen worse attempts at erotic fanfiction written for the author's own perusal, but not many.

      --
      I am "that girl" your mother warned you about...
    • (Score: 2) by urza9814 on Monday July 30 2018, @06:12PM

      by urza9814 (3954) on Monday July 30 2018, @06:12PM (#714855) Journal

      One of the major issues here is that computer analysis can be used to conceal bias.

      Look at the criminal population with the unbiased eye of an AI

      That's not possible -- there's a bias inherent in the dataset, and if you use a biased dataset to train an AI you're going to end up with a biased AI. "The prison population" doesn't directly tell you anything about who commits crime, it only tells you about who gets caught. If a certain population is over-represented there, it COULD be because they inherently commit more crimes...or because they're targeted more by police, or because they're less able to earn gainful employment and therefore more often forced to resort to crime, or because they're less likely to find competent legal representation, or because they're less competent at the crimes they do commit, or any number of other reasons. If you're only training the AI based on what humans have already done, then it can only learn to mimic humans -- including mimicking our mistakes. So saying that the AI is inherently unbiased is no different from saying the original humans are inherently unbiased. Do you really think the US justice system never makes a mistake and has zero bias in its activities?
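      The "learns the dataset's bias" point can be shown with a toy sketch (all numbers below are invented for illustration): two groups offend at the same true rate, but one is policed more heavily, and anything trained on arrest records learns the enforcement bias, not the offense rate.

      ```python
      # Toy illustration with invented numbers: two groups with identical true
      # offense rates, but group A is policed twice as heavily as group B.
      true_offense_rate = {"A": 0.05, "B": 0.05}   # equal by construction
      policing_intensity = {"A": 2.0, "B": 1.0}    # A is watched twice as hard

      # The "training data" is arrest rates, which mix offending with enforcement.
      arrest_rate = {g: true_offense_rate[g] * policing_intensity[g]
                     for g in true_offense_rate}

      # A naive model that learns from arrests concludes A is twice as "criminal",
      # even though the underlying offense rates are identical by construction.
      print(arrest_rate)  # {'A': 0.1, 'B': 0.05}
      ```

      Feed that skewed label distribution to any classifier and the skew comes out the other end; the model has no way to distinguish "commits more crime" from "gets caught more often".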

  • (Score: 5, Interesting) by JoeMerchant on Saturday July 28 2018, @09:04PM

    by JoeMerchant (3937) on Saturday July 28 2018, @09:04PM (#714102)

    10 years ago I had to give a PhD a lecture about how fallible his PC based fingerprint reader was and that, while we could use it as a convenience, there had to be a backup method of identification.

    Seems that this self styled "smartest guy in the room" had never experienced an inability to log in with his PC fingerprint reader (it happened about 10% of the time to about 30% of the population back then), so he believed what he saw on TV and in movies: that fingerprints were unique and infallible. They're good - when they're processed by experts they're very good - but with ~140 billion fingerprint patterns just in the living human population, and real-world challenges of partial prints and fuzzy features, they are not a unique, perfect identification tool.
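    Taking the anecdote's own figures at face value (a ~10% failure rate for roughly 30% of users - numbers from the story above, not a measured benchmark), a back-of-envelope run shows why the backup method was non-negotiable:

    ```python
    # Back-of-envelope using the figures quoted above (anecdotal, not measured).
    affected_fraction = 0.30         # share of users the reader struggles with
    failure_rate_if_affected = 0.10  # how often it fails for those users

    # Expected share of ALL authentication attempts that fail outright:
    overall_failure = affected_fraction * failure_rate_if_affected
    print(f"{overall_failure:.0%} of all attempts fail")  # 3% of all attempts fail

    # For a hypothetical 1000-user system at 4 logins per user per day:
    lockouts_per_day = int(1000 * 4 * overall_failure)
    print(lockouts_per_day)  # 120
    ```

    A hundred-plus lockouts a day with no fallback path is how a "convenience" feature becomes a support nightmare.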

    --
    🌻🌻 [google.com]
  • (Score: 3, Insightful) by Captival on Saturday July 28 2018, @09:16PM (1 child)

    by Captival (6866) on Saturday July 28 2018, @09:16PM (#714108)

    I think the thing undercounted. I'm sure WAY more than 5% of Congress are criminals.

    • (Score: 2) by urza9814 on Monday July 30 2018, @06:16PM

      by urza9814 (3954) on Monday July 30 2018, @06:16PM (#714857) Journal

      Yeah, but they commit the kind of crimes that don't get caught, therefore an AI trained based on current prisoners shouldn't detect them.

      The thing about AI is that what it does is different from what it's marketed as -- ie, "detecting criminals" vs "detecting prisoners". These are not actually the same task.

  • (Score: 5, Insightful) by digitalaudiorock on Sunday July 29 2018, @01:24AM

    by digitalaudiorock (688) on Sunday July 29 2018, @01:24AM (#714177) Journal

    How about how all states (that I'm aware of) now require your drivers license photo to be a) without glasses, b) a neutral expression, and c) facing straight forward. No question why that is...so we can all be subject to this unwarranted search at the whim of law enforcement. Nice to know how "reliable" it is. OBL really was an evil genius using a handful of guys with box-cutters to get us to give up all our freedom.

  • (Score: 2) by The Shire on Sunday July 29 2018, @04:57AM (12 children)

    by The Shire (5824) on Sunday July 29 2018, @04:57AM (#714238)

    People are deliberately conflating the concept of a filter with an ID.

    When you enter search terms into an engine you don't expect to get back one exact match for what you were looking for. You expect to get a list of narrowed down results which you then parse using your own judgement. Facial recognition is the same way. It's not intended to "finger the bad guy". It's intended to narrow the field to reduce the workload on the investigator who then uses their judgement to decide who in the returned list, if anyone, is who they were looking for.

    And it's deceptive to use such a small sample size to train the system on and then select an 80% confidence level and expect to get a precise result. 95% confidence would be more appropriate and I have no doubt they tried that and didn't get the result they wanted so they reduced it until they got the false positives they wanted for a headline.
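    The threshold effect is easy to demonstrate with a toy simulation. The score distribution below is invented and is not Rekognition's actual scoring - it just has the long upper tail any similarity metric has - but it shows how dropping the cutoff from 95% to 80% multiplies the false matches:

    ```python
    import random

    random.seed(0)  # deterministic for the example

    # Invented model: similarity scores for 25,000 pairs of UNRELATED faces,
    # drawn from a distribution that sits low but has a long upper tail.
    non_match_scores = [random.betavariate(2, 5) for _ in range(25_000)]

    for threshold in (0.95, 0.80):
        false_hits = sum(score >= threshold for score in non_match_scores)
        print(f"threshold {threshold:.2f}: {false_hits} false matches")
    ```

    With this particular distribution the 0.80 cutoff lets through dozens of false matches that the 0.95 cutoff rejects. The exact counts depend entirely on the invented distribution, but the direction of the effect does not.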

    • (Score: 4, Interesting) by janrinok on Sunday July 29 2018, @07:09AM (3 children)

      by janrinok (52) Subscriber Badge on Sunday July 29 2018, @07:09AM (#714253) Journal

      I have no doubt they tried that and didn't get the result they wanted

      And you have no doubt that they did this because you have evidence of it, or simply because it fits your own point of view? It might have happened, but there is nothing to support that speculative statement. You appear to be just as guilty of spinning a story as you are accusing the ACLU of being.

      • (Score: 2) by The Shire on Sunday July 29 2018, @01:12PM (2 children)

        by The Shire (5824) on Sunday July 29 2018, @01:12PM (#714338)

        If the ACLU was only interested in accuracy they would have properly portrayed that this tool is only used to narrow the field, not as a way to ID a criminal. It implies strongly that they are being intentionally deceptive to further their own agenda. I don't think my comment spins that at all.

        • (Score: 3, Insightful) by number11 on Sunday July 29 2018, @07:56PM (1 child)

          by number11 (1170) Subscriber Badge on Sunday July 29 2018, @07:56PM (#714442)

          If the ACLU was only interested in accuracy they would have properly portrayed that this tool is only used to narrow the field, not as a way to ID a criminal. It implies strongly that they are being intentionally deceptive to further their own agenda.

          As does the approach of thinking that no policeman will ever act as if it is a way to ID a criminal. The same way that policemen believe that drug dogs are always right, and that bargain-basement field tests for drugs are accurate.

          The false positive rate is 5%. That means that one out of every 20 people passing the cop will be treated as a criminal. And it will probably happen repeatedly to the same 5%.
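          The parent's 5% figure also runs into the base-rate problem: a scanned crowd is overwhelmingly innocent, so even a modest false-positive rate means nearly every "match" is wrong. A quick sketch (the 5% comes from the comment above; the crowd size and hit rate are invented for scale):

          ```python
          # Base-rate arithmetic. The 5% false-positive rate comes from the parent
          # comment; the crowd size and true-positive rate are invented for scale.
          crowd_scanned = 100_000        # innocent passers-by in view of the cameras
          actually_wanted = 10           # genuinely wanted people in that crowd
          false_positive_rate = 0.05
          true_positive_rate = 0.90      # assume the system catches most real matches

          false_alarms = crowd_scanned * false_positive_rate  # 5000 wrong "matches"
          real_hits = actually_wanted * true_positive_rate    # 9 correct ones

          # Chance that any given alert actually points at a wanted person:
          precision = real_hits / (real_hits + false_alarms)
          print(f"{precision:.2%}")  # 0.18% -- over 99.8% of alerts are false alarms
          ```

          That is why "it's just a filter" offers little comfort: whoever answers the alerts spends almost all of their time confronting people the system got wrong.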

          • (Score: 2) by The Shire on Monday July 30 2018, @01:24AM

            by The Shire (5824) on Monday July 30 2018, @01:24AM (#714558)

            When the confidence of a neural net is set to 80% I can promise you the false positive rate is MUCH higher than 5%. Again, they're using this tool incorrectly. It's a filter not a pointer. It returns a large set of possible matches that a human must then sift through and decide which if any is an actual match. No one out there is using this for traffic stops expecting it to be a positive ID. The ACLU is making claims that a system is failing to do what it was never designed or intended to do.

            If you give me a hammer and I then run around smashing car windows, you don't blame the hammer for being destructive, you blame the user for using the tool incorrectly. That's the case here as well. This software is not intended to positively ID anyone, nor is it offered as something that does so.

    • (Score: 3, Informative) by c0lo on Sunday July 29 2018, @10:48AM (6 children)

      by c0lo (156) Subscriber Badge on Sunday July 29 2018, @10:48AM (#714293) Journal

      And it's deceptive to use such a small sample size to train the system on and then select an 80% confidence level and expect to get a precise result.

      Actually "80% accuracy" setting is the Amazon's default. Wanna bet your average doughnut munching policeman won't use other settings? TFA

      Using Rekognition, we built a face database and search tool using 25,000 publicly available arrest photos. Then we searched that database against public photos of every current member of the House and Senate. We used the default match settings that Amazon sets for Rekognition.

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 2) by The Shire on Sunday July 29 2018, @01:08PM (5 children)

        by The Shire (5824) on Sunday July 29 2018, @01:08PM (#714335)

        You're missing the point of my post: it's a filter. Even at 80% it generates a narrowed list of subjects to investigate; it doesn't point the finger at one person. This is a tool, not a jury. The ACLU is deliberately misleading the public about how this is used.

        • (Score: 2, Insightful) by Anonymous Coward on Sunday July 29 2018, @01:52PM (1 child)

          by Anonymous Coward on Sunday July 29 2018, @01:52PM (#714347)

          The people with the most skin in the game to deliberately mislead the public about how this will be used are Amazon and law enforcement. Unless we keep a close watch on them, this will be used as a pointer not a filter.

          • (Score: 1, Insightful) by Anonymous Coward on Sunday July 29 2018, @03:41PM

            by Anonymous Coward on Sunday July 29 2018, @03:41PM (#714377)

            Yup! And it won't stop innocent people from being harassed or worse.

        • (Score: 3, Insightful) by c0lo on Sunday July 29 2018, @09:30PM (2 children)

          by c0lo (156) Subscriber Badge on Sunday July 29 2018, @09:30PM (#714465) Journal

          You're missing the point of my post: it's a filter.

          And it will be used as a pointer.
          Step into the policeman's shoes, with limited (objectively or subjectively) effort capacity:
          - without the tech, he needs to search for other clues first and apply the "looks like" filter later
          - with the tech, he gets a list of persons "identified" as possible suspects, and he'll very likely start working from that list.

          The ACLU is deliberately misleading the public about how this is used.

          Really? Is it already used?
          If not, how can you be so sure?

          --
          https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
          • (Score: 2) by HiThere on Sunday July 29 2018, @11:42PM

            by HiThere (866) Subscriber Badge on Sunday July 29 2018, @11:42PM (#714507) Journal

            FWIW, it *is* already used. (Possibly not exactly this software.) I don't know that it's used by police, but I believe that it was yesterday or the day before that there was news on Soylent that two Canadian Malls were using it, and that the owners of those malls claimed that "others were using it". It's true they also claimed that they didn't track the data that would allow individuals to be identified. Believe them if you want to, it might be true.

            --
            Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
          • (Score: 2) by The Shire on Monday July 30 2018, @01:27AM

            by The Shire (5824) on Monday July 30 2018, @01:27AM (#714560)

            Yes, really, it's already being used:

            https://www.npr.org/2018/05/22/613115969/orlando-police-testing-amazons-real-time-facial-recognition [npr.org]

            And further:

            "The Washington County Sheriff's Office says it does not use Rekognition in real time and doesn't intend to."

            So they're using it correctly - as an investigative tool, not something that attempts to positively ID people in real time.

    • (Score: 0) by Anonymous Coward on Monday July 30 2018, @03:33AM

      by Anonymous Coward on Monday July 30 2018, @03:33AM (#714594)

      > When you enter search terms into an engine you don't expect to get back one exact match for what you were looking for.

      Uh, yes I do. That expectation is why it's so irritating when the search engines randomly remove words from my search to give me more irrelevant results to sift through.
