
posted by Fnord666 on Friday August 02 2019, @03:56PM   Printer-friendly
from the think-of-the-children! dept.

Submitted via IRC for Bytram

She Was Arrested at 14. Then Her Photo Went to a Facial Recognition Database.

The New York Police Department has been loading thousands of arrest photos of children and teenagers into a facial recognition database despite evidence the technology has a higher risk of false matches in younger faces.

For about four years, internal records show, the department has used the technology to compare crime scene images with its collection of juvenile mug shots, the photos that are taken at an arrest. Most of the photos are of teenagers, largely 13 to 16 years old, but children as young as 11 have been included.

Elected officials and civil rights groups said the disclosure that the city was deploying a powerful surveillance tool on adolescents — whose privacy seems sacrosanct and whose status is protected in the criminal justice system — was a striking example of the Police Department's ability to adopt advancing technology with little public scrutiny.

Several members of the City Council as well as a range of civil liberties groups said they were unaware of the policy until they were contacted by The New York Times.

Police Department officials defended the decision, saying it was just the latest evolution of a longstanding policing technique: using arrest photos to identify suspects.

"I don't think this is any secret decision that's made behind closed doors," the city's chief of detectives, Dermot F. Shea, said in an interview. "This is just process, and making sure we're doing everything to fight crime."


Original Submission

This discussion has been archived. No new comments can be posted.
  • (Score: 0) by Anonymous Coward on Saturday August 03 2019, @04:31PM (4 children)

    by Anonymous Coward on Saturday August 03 2019, @04:31PM (#875181)

    Can't help wondering how facial recognition in Asian countries, especially China, compares to facial recognition in the western world. When mostly non-Caucasians are working on the software, surely the software inherits different assumptions and presumptions?

    Thanks for the non-insightful racism. The assumption is that you have a nose, ears, a mouth and eyes; the software just needs to identify the positions of those. So, as you can see, the differences have little to do with whether someone is Caucasian or non-Caucasian.

  • (Score: 2) by Runaway1956 on Sunday August 04 2019, @12:32AM (3 children)

    by Runaway1956 (2926) Subscriber Badge on Sunday August 04 2019, @12:32AM (#875306) Journal

    Racism. It's always racism when a statement or observation doesn't match your world view.

    • (Score: 1, Troll) by ikanreed on Monday August 05 2019, @05:50PM (2 children)

      by ikanreed (3164) Subscriber Badge on Monday August 05 2019, @05:50PM (#876097) Journal

      Buddy, you personally are a dumb, racist fuck. "It's always racism" because you're constantly being racist. Got enough history there to know it. So you're just complaining about people responding reasonably to your behavior.

      • (Score: 3, Informative) by takyon on Monday August 05 2019, @06:34PM (1 child)

        by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday August 05 2019, @06:34PM (#876126) Journal

        Can't help wondering how facial recognition in Asian countries, especially China, compares to facial recognition in the western world. When mostly non-Caucasians are working on the software, surely the software inherits different assumptions and presumptions?

        Both you and the AC must have missed stories like these:

        Facial Recognition Is Accurate, if You’re a White Guy [nytimes.com]
        IBM Releases "Diversity in Faces" Dataset for Facial Recognition Systems [soylentnews.org]

        Until recently, facial recognition software has been great at identifying white faces, but not black ones, due to deficiencies in the data sets and the biases of the programmers.

        Ironically, the called-for improvements will make it easier for police mass surveillance to be used as a tool against blacks.

        I doubt China will have any problem with their systems. They are already [soylentnews.org] using it [soylentnews.org], and Westerners will stick out like a sore thumb anyway.

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 0) by Anonymous Coward on Monday August 05 2019, @11:23PM

          by Anonymous Coward on Monday August 05 2019, @11:23PM (#876229)

          In modern artificial intelligence, data rules. A.I. software is only as smart as the data used to train it. If there are many more white men than black women in the system, it will be worse at identifying the black women.

          Bingo. That is exactly one of the bits of data I was fishing for.
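The data-imbalance point quoted above can be illustrated with a toy experiment (an editorial sketch, not any real face-recognition system): people are represented as random embedding vectors, each person's gallery template is averaged from noisy "enrollment photos," and identification picks the nearest template. Giving one group many enrollment shots and the other only one shows how thinner data for a group tends to mean worse match accuracy for that group. All names and parameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def identify_accuracy(n_train_shots, n_people=50, dim=16, noise=0.6, trials=500):
    """Fraction of probes matched to the correct person via nearest template.

    Each person has a 'true' embedding; their gallery template is the mean
    of n_train_shots noisy observations of it. Fewer shots -> noisier template.
    """
    true = rng.normal(size=(n_people, dim))
    shots = true[:, None, :] + rng.normal(scale=noise,
                                          size=(n_people, n_train_shots, dim))
    gallery = shots.mean(axis=1)  # one averaged template per person

    correct = 0
    for _ in range(trials):
        p = rng.integers(n_people)
        probe = true[p] + rng.normal(scale=noise, size=dim)  # new noisy photo
        pred = np.argmin(((gallery - probe) ** 2).sum(axis=1))
        correct += int(pred == p)
    return correct / trials

# Well-represented group: many enrollment shots per person.
acc_majority = identify_accuracy(n_train_shots=20)
# Underrepresented group: a single enrollment shot per person.
acc_minority = identify_accuracy(n_train_shots=1)

print(f"majority-group accuracy: {acc_majority:.2f}")
print(f"minority-group accuracy: {acc_minority:.2f}")
```

With everything else held equal, the group whose templates are built from less data is matched less reliably, which is the mechanism the commenters are describing: the model is only as good as the data behind each group it has to recognize.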