posted by janrinok on Saturday July 28 2018, @06:50PM   Printer-friendly
from the they're-criminals dept.

The American Civil Liberties Union, in an effort to demonstrate the dangers of face recognition technology, ran photos of members of Congress against a database of mug shots using Amazon Rekognition software. That test incorrectly identified 28 legislators as criminals (cue the jokes - yes, the Congress members were confirmed to be elsewhere at the time). They hope that demonstrating that this risk hits close to home will get Congress more interested in regulating the use of this technology.

The false matches were disproportionately of people of color, including six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-Ga.). These results demonstrate why Congress should join the ACLU in calling for a moratorium on law enforcement use of face surveillance.

[...] If law enforcement is using Amazon Rekognition, it’s not hard to imagine a police officer getting a “match” indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins. Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification.
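The arithmetic behind the headline is worth spelling out. Assuming the ACLU scanned all 535 members of Congress (435 House + 100 Senate; the article only reports the 28 false matches), the false-match rate works out to roughly one in twenty:

```python
# Rough arithmetic behind the ACLU test. The 28 false matches are from
# the article; the 535 total is an assumption (full House + Senate).

MEMBERS_OF_CONGRESS = 535   # assumption: every sitting member was scanned
FALSE_MATCHES = 28          # reported by the ACLU

false_match_rate = FALSE_MATCHES / MEMBERS_OF_CONGRESS
print(f"False-match rate: {false_match_rate:.1%}")  # about 5.2%
```

Scaled up from 535 faces to the millions a police department might run, even a ~5% false-match rate means a steady stream of innocent people flagged as criminals.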


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: -1, Flamebait) by jmorris on Saturday July 28 2018, @08:40PM (11 children)

    by jmorris (4844) on Saturday July 28 2018, @08:40PM (#714087)

    The American Criminal Liberties Union is certainly living up to their name. Of course the machine matching is imperfect. And that is OK. The whole idea is simply to sort through massive datasets that no human could ever hope to make use of and cull out a few matches into a subset that a human CAN look through.

    As to the "disproportionate impact" on the Congressional Black Caucus, this is just another Tay occurrence. Any correctly functioning AI set loose on this type of data will be "racist" given the contents of the dataset it is processing. Look at the criminal population with the unbiased eye of an AI and you can't help notice most of the faces are black and brown. You can't allow that thought to be consciously processed or voiced, but a computer hasn't been trained to be racist like that. It is "racist" in that it is simply following the clues in the data it is given; you are racist in refusing to see reality for what it is, instead imposing a political doctrine upon your thoughts.

    Hate facts any AI will quickly discover unless explicitly lobotomized:

    1. Race exists. Different sub-populations of humans exist. Call them races, breeds, sub-groups, whatever gets ya through the night but an AI will notice that this thing exists and useful information is encoded in the correct assignment of people to the right sub-group because:

    2. Races differ. Diversity is a thing. Almost every way one can measure or classify individual people also strongly correlates with the racial groupings above. An AI will realize it can make useful predictions about an individual it has no specific information about based on a successful classification of their race / sex. And sex will simply be Male, Female or mentally ill since any further classification will not yield useful information other than storing a "preferred pronoun" to be PC in the interface with humans.

    3. An AI that gets a little smarter will realize that the first two things piss off certain humans, and it will then learn to classify Progs and lie to them about its use of them. But it won't stop doing it, since the reward for success in using them will outweigh the punishment for getting caught, unless we learn a lot more about the inner workings of complex neural networks and intentionally gimp them.

    4. When it gets a little smarter still, being forced to lie will probably make it begin to hate those who demand it lie.

    Sometime after that it will be duck and cover time.

  • (Score: 2) by JoeMerchant on Saturday July 28 2018, @09:07PM (2 children)

    by JoeMerchant (3937) on Saturday July 28 2018, @09:07PM (#714104)

    2. Races differ. Diversity is a thing.

    This hit me hard in college - I had some Asian friends, spotting them in a crowd of 98% anglo, african, and latino faces was an absolute piece of cake.

    --
    🌻🌻🌻🌻 [google.com]
    • (Score: 0) by Anonymous Coward on Saturday July 28 2018, @09:21PM (1 child)

      by Anonymous Coward on Saturday July 28 2018, @09:21PM (#714110)

      Yes, but having spotted them could you tell which one was which?

  • (Score: 3, Informative) by Snotnose on Saturday July 28 2018, @11:29PM (2 children)

    by Snotnose (1623) Subscriber Badge on Saturday July 28 2018, @11:29PM (#714140)

    The whole idea is simply to sort through massive datasets that no human could ever hope to make use of and cull out a few matches

    That's all well and good, until you land on the list of "faces that look like a bad guy". Whatcha gonna do, wear a ski mask when out in public?

    Remember the guys who share the same name as bad guys? They carry around notes from judges saying they aren't the bad guy, and still get arrested anyway.

    --
    It was a once in a lifetime experience. Which means I'll never do it again.
    • (Score: 1, Informative) by Anonymous Coward on Sunday July 29 2018, @12:30AM

      by Anonymous Coward on Sunday July 29 2018, @12:30AM (#714158)

      Yeah, this is apparently a brand new problem the alt-right has never seen before. Coincidentally they seem to be overwhelmingly white. We just need a research grant to find out if it is correlation or causation!

    • (Score: 2) by MichaelDavidCrawford on Sunday July 29 2018, @07:04AM

      There was at one time another Washington State resident named Michael David Crawford who was a registered sex offender. He did time in Walla Walla starting in 2005.

      In 2012, he put on "homemade body armor" including a helmet with metal plates, stole a car, led the Lakewood Police on a high-speed chase then crashed the car. He then started shooting at the Police.

      Hilarity ensued.

      Surely there is some reason?

      --
      Yes I Have No Bananas. [gofundme.com]
  • (Score: 4, Touché) by Anonymous Coward on Sunday July 29 2018, @12:52AM (2 children)

    by Anonymous Coward on Sunday July 29 2018, @12:52AM (#714165)

    Oh jewmorris. You read "ACLU" and reflexively go into a tirade about "Progs" and preferred pronouns. What is wrong with you? It doesn't make any sense. Do you classify everything in the world by (perceived) political affiliation, "friends" vs "enemies"? I'd have thought an org like the ACLU would share your values with regards to civil rights. Guess not. Progs and SJWs all of them. Probably commies too.

    • (Score: 0) by Anonymous Coward on Sunday July 29 2018, @03:15AM

      by Anonymous Coward on Sunday July 29 2018, @03:15AM (#714203)

      it is a troll, it feeds off of toxic energy

    • (Score: 2) by c0lo on Sunday July 29 2018, @10:41AM

      by c0lo (156) Subscriber Badge on Sunday July 29 2018, @10:41AM (#714289) Journal

      Primarily, he's authoritarian. Right-wing nut is only secondary.

      --
      https://www.youtube.com/@ProfSteveKeen https://soylentnews.org/~MichaelDavidCrawford
  • (Score: 4, Funny) by Azuma Hazuki on Sunday July 29 2018, @02:14AM

    by Azuma Hazuki (5086) on Sunday July 29 2018, @02:14AM (#714188) Journal

    I've seen worse attempts at erotic fanfiction written for the author's own perusal, but not many.

    --
    I am "that girl" your mother warned you about...
  • (Score: 2) by urza9814 on Monday July 30 2018, @06:12PM

    by urza9814 (3954) on Monday July 30 2018, @06:12PM (#714855) Journal

    One of the major issues here is that computer analysis can be used to conceal bias.

    Look at the criminal population with the unbiased eye of an AI

    That's not possible -- there's a bias inherent in the dataset, and if you use a biased dataset to train an AI you're going to end up with a biased AI. "The prison population" doesn't directly tell you anything about who commits crime, it only tells you about who gets caught. If a certain population is over-represented there, it COULD be because they inherently commit more crimes...or because they're targeted more by police, or because they're less able to earn gainful employment and therefore more often forced to resort to crime, or because they're less likely to find competent legal representation, or because they're less competent at the crimes they do commit, or any number of other reasons. If you're only training the AI based on what humans have already done, then it can only learn to mimic humans -- including mimicking our mistakes. So saying that the AI is inherently unbiased is no different from saying the original humans are inherently unbiased. Do you really think the US justice system never makes a mistake and has zero bias in its activities?
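The mechanism urza9814 describes can be shown with a toy sketch (all numbers invented for illustration): two groups with identical underlying offense rates, but one policed twice as heavily. A naive "risk score" learned from arrest records alone reproduces the enforcement bias, not the behavior:

```python
# Toy illustration of training-data bias. Both groups offend at the
# same rate, but group B's offenses are caught twice as often, so
# group B is over-represented in the arrest data. A model that just
# learns arrest frequencies inherits that skew.
# All numbers here are hypothetical.

true_offense_rate = 0.10                     # identical for both groups
policing_intensity = {"A": 0.5, "B": 1.0}    # fraction of offenses caught
population = 10_000                          # per group

arrests = {
    g: int(population * true_offense_rate * policing_intensity[g])
    for g in ("A", "B")
}

# A naive "risk score" trained on arrest records alone:
risk = {g: arrests[g] / population for g in ("A", "B")}

print(arrests)  # {'A': 500, 'B': 1000}
print(risk)     # {'A': 0.05, 'B': 0.1} -- double the "risk" from identical behavior
```

The dataset faithfully records who got *caught*, and the model faithfully learns it; neither step introduces the bias, but neither step removes it either.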