
posted by mrpg on Wednesday December 20 2017, @10:40PM
from the if-!white-then-kick() dept.

The New York City Council has unanimously passed a bill to address algorithmic discrimination by city agencies. If signed by Mayor Bill de Blasio, New York City will establish a task force to study how city agencies use data and algorithms to make decisions, and whether systems appear to discriminate against certain groups:

The bill's sponsor, Council Member James Vacca, said he was inspired by ProPublica's investigation into racially biased algorithms used to assess the criminal risk of defendants. "My ambition here is transparency, as well as accountability," Vacca said.

A previous, more sweeping version of the bill had mandated that city agencies publish the source code of all algorithms being used for "targeting services" or "imposing penalties upon persons or policing" and to make them available for "self-testing" by the public. At a hearing at City Hall in October, representatives from the mayor's office expressed concerns that this mandate would threaten New Yorkers' privacy and the government's cybersecurity.

The bill was one of two moves the City Council made last week concerning algorithms. On Thursday, the committees on health and public safety held a hearing on the city's forensic methods, including controversial tools that the chief medical examiner's office crime lab has used for difficult-to-analyze samples of DNA. As a ProPublica/New York Times investigation detailed in September, an algorithm created by the lab for complex DNA samples has been called into question by scientific experts and former crime lab employees.


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 3, Interesting) by looorg (578) on Wednesday December 20 2017, @11:50PM (#612618) (1 child)

    Just out of curiosity, how do you make decisions (algorithmically or not) without "discriminating" against someone or some group? It's all about making choices and selections, so someone is going to be left out for failing or missing one variable or another.

    Invented by Caragine and Adele Mitchell, a geneticist with a specialty in statistics who joined the lab in 2008, the Forensic Statistical Tool, or FST, considers the overall amount of DNA in the mixture, how many people are in it, how much information is probably missing or contaminated, and the frequency with which each piece of DNA appears in different racial or ethnic groups. Then it compares the defendant’s DNA profile to the mixture, and calculates a likelihood ratio, which it expresses as a single number.
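    The FST itself is proprietary, so the following is only a minimal sketch of the general likelihood-ratio idea that quote describes, using the textbook inclusion-exclusion formula for allele mixtures under Hardy-Weinberg assumptions. The allele names and frequencies are made up, and the real tool additionally models drop-out, drop-in, and contamination, which this toy version ignores:

        from itertools import combinations

        def p_mixture(freqs, observed, required, n_unknowns):
            # Probability that n_unknowns random people carry only alleles
            # from `observed` while jointly showing every allele in `required`,
            # via inclusion-exclusion over the required alleles they might miss.
            p_obs = sum(freqs[a] for a in observed)
            total = 0.0
            for k in range(len(required) + 1):
                for missed in combinations(required, k):
                    p_missed = sum(freqs[a] for a in missed)
                    total += (-1) ** k * (p_obs - p_missed) ** (2 * n_unknowns)
            return total

        def likelihood_ratio(freqs, observed, suspect, n_contributors):
            # Hp: the suspect plus (n_contributors - 1) unknowns made the mixture.
            # (Assumes the suspect's alleles all appear in `observed`.)
            unexplained = set(observed) - set(suspect)
            p_hp = p_mixture(freqs, observed, unexplained, n_contributors - 1)
            # Hd: the mixture came from n_contributors unknown people.
            p_hd = p_mixture(freqs, observed, set(observed), n_contributors)
            return p_hp / p_hd

        # Hypothetical frequencies at one locus, and a two-person mixture:
        freqs = {"A": 0.10, "B": 0.20, "C": 0.05, "D": 0.30}
        observed = {"A", "B", "C"}  # alleles detected in the sample
        suspect = ("A", "B")        # the defendant's genotype at this locus
        print(likelihood_ratio(freqs, observed, suspect, n_contributors=2))  # ~7.7

    Per-locus ratios like this get multiplied across a dozen or more loci, which is how the headline numbers grow so large.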

    So from the article it seems they are making selections based on DNA and statistics to weed out which potential subjects to test against. While the margin of error seems to be upwards of 30%, a lot of it also comes down to how people react to it: OMG IT'S DNA, SO HE MUST BE THE MURDERER! And then people plead guilty. On the other hand, nobody seems overly concerned that it could also have selected away people who were actually guilty. The large numbers also seem to confuse people, making the results look more important or impressive than they are. Oh, so it's like 55 million? That seems pretty damning -- unless, you know, you don't have a frame of reference.
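    For a frame of reference: a likelihood ratio only multiplies whatever odds you started with, so the same headline figure supports very different conclusions depending on how the suspect was found in the first place. A back-of-the-envelope sketch, with purely hypothetical numbers:

        def posterior_probability(lr, prior_odds):
            # Bayes' rule in odds form: posterior odds = LR * prior odds,
            # then convert the odds back into a probability.
            post_odds = lr * prior_odds
            return post_odds / (1 + post_odds)

        lr = 55_000_000  # the kind of number a jury hears

        # Suspect already implicated by independent evidence (prior odds 1 to 9):
        print(posterior_probability(lr, 1 / 9))           # ~0.9999998

        # Suspect found by trawling a database of ten million profiles:
        print(posterior_probability(lr, 1 / 10_000_000))  # ~0.85

    In the trawl scenario there is still a sizeable chance the match is a coincidence, which the bare 55 million on its own completely hides.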

    ... asked for the FST source code, which other lawyers had sought in vain. Again, the government refused to hand it over on the grounds that it was a “proprietary and copyrighted” statistical tool owned by the City of New York.

    Well, if you are going to use statistics as your basis, it would be appropriate to at least share your model or the recipe (or algorithm) that you use. After all, any idiot can fudge up an index of some kind to show pretty much whatever they like.

    He found that the program dropped valuable data from its calculations, in ways that users wouldn’t necessarily be aware of, but that could unpredictably affect the likelihood assigned to the defendant’s DNA being in the mixture. “I did not leave with the impression that FST was developed by an experienced software development team,”

    Ouch! Still, if someone (the above-mentioned people) developed the method, didn't they find it odd when the method and the software didn't produce the expected or even similar results? Seems a bit like they are looking for someone to toss under the bus or fall on the sword, etc.
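    To see why silently dropped data matters, reuse the likelihood_ratio() sketch above: discarding a single low-level allele that the defendant does not carry (say it fell below some internal peak-height threshold) roughly doubles the number reported against him. Toy figures again, not FST's actual behaviour:

        # Same freqs and suspect as in the sketch above.
        full    = {"A", "B", "C"}  # every allele actually present in the sample
        trimmed = {"A", "B"}       # "C" silently dropped by the software
        print(likelihood_ratio(freqs, full, suspect, n_contributors=2))     # ~7.7
        print(likelihood_ratio(freqs, trimmed, suspect, n_contributors=2))  # ~14.1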

  • (Score: 2) by frojack (1554) on Thursday December 21 2017, @12:07AM (#612631) Journal

    Just out of curiosity, how do you make decisions (algorithmically or not) without "discriminating" against someone or some group? It's all about making choices and selections, so someone is going to be left out for failing or missing one variable or another.

    While that may be true, the important part is that the algorithm, or the programmer behind it, should not be substituting their own discrimination for the discrimination agreed upon by society (which is presumably encoded into law).

    DNA eliminates many billions of subjects long before it gets down to selecting a small handful of valid subjects. In medical situations this saves millions of lives. In criminal situations, DNA is still not considered proof in the face of incontrovertible evidence to the contrary. Planting of evidence is a thing, and the courts recognize this.

    --
    No, you are mistaken. I've always had this sig.