
posted by chromas on Friday July 17 2020, @02:40AM
from the if-you-have-nothing-to-hide-y—oh-wait dept.

Facial recognition linked to a second wrongful arrest by Detroit police:

A false facial recognition match has led to the arrest of another innocent person. According to the Detroit Free Press, police in the city arrested a man for allegedly reaching into a person's car, taking their phone and throwing it, breaking the case and damaging the screen in the process.

Facial recognition flagged Michael Oliver as a possible suspect, and the victim identified him in a photo lineup as the person who damaged their phone. Oliver was charged with a felony count of larceny over the May 2019 incident. He said he didn't commit the crime and the evidence supported his claim.

The perpetrator, who was recorded in footage captured on a phone, doesn't look like Oliver. For one thing, he has tattoos on his arms, and there aren't any visible on the person in the video. When Oliver's attorney took photos of him to the victim and an assistant prosecutor, they agreed Oliver had been misidentified. A judge later dismissed the case.

[...] Late last month, Detroit Police Chief James Craig suggested the technology the department uses, which was created by DataWorks Plus, isn't always reliable. "If we were just to use the technology by itself, to identify someone, I would say 96 percent of the time it would misidentify," he said in a public meeting, according to Motherboard. From the start of the year through June 22nd, the force used the software 70 times per the department's public data. In all but two of those cases, the person whose image the technology analyzed was Black.


Original Submission

 
  • (Score: 1, Interesting) by Anonymous Coward on Friday July 17 2020, @06:17AM (1 child)

    by Anonymous Coward on Friday July 17 2020, @06:17AM (#1022797)

    "From the start of the year through June 22nd, the force used the software 70 times per the department's public data. In all but two of those cases, the person whose image the technology analyzed was Black."

    As of the 2010 census, Detroit is about 83% [wikipedia.org] black, and that share has been climbing for decades as everybody else leaves the city. E.g., from 2000 to 2010 the total city population dropped 25% while the black share of the population increased from 76% to 83%. 2 out of 70 seems hardly meaningful, and was likely included, without context, only to mislead those who are not aware of the demographics of Detroit.
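
    One way to frame that comparison concretely is a binomial baseline check. The sketch below (Python; it assumes the 70 uses were independent draws from the overall city population at the quoted 83% share, an assumption not from the story) computes how far 68 of 70 sits from that baseline:

        # Binomial baseline check: if 70 subjects were independent draws from a
        # population that is 83% black, how likely is seeing 68 or more?
        # Assumptions (not from the story): independence, and that the overall
        # city share is the right baseline, which is exactly the "context"
        # question raised above.
        from math import comb

        n, k, p = 70, 68, 0.83  # software uses, black subjects, 2010 baseline

        expected = n * p
        # Exact upper-tail probability P(X >= k) for X ~ Binomial(n, p)
        tail = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
        print(f"Expected count at baseline: {expected:.1f} of {n}")
        print(f"P(X >= {k}) = {tail:.5f}")

    Whether the city-wide share is the right baseline, rather than, say, the demographics of reported suspects, is the contextual question at issue; the sketch only makes the comparison explicit, it doesn't settle it.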

  • (Score: 5, Insightful) by FatPhil on Friday July 17 2020, @11:01AM

    by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Friday July 17 2020, @11:01AM (#1022838) Homepage
    It has been proven, time and time again (the most recent paper on it appeared on SN within the last month), that facial recognition software is highly unreliable on darker skin tones.

    This story isn't about one case. You're not seeing the bigger picture. This is a story about wrongthink.

    Relying on a tool that is known to work least well on the population that you have in front of you is an extra stubborn type of wrongthink. Mentioning that population is useful context for evaluating the wrongthink.

    We're all sorry it triggered you; would you like a hug?
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves