The American Civil Liberties Union, in an effort to demonstrate the dangers of face recognition technology, ran photos of members of Congress against a database of mug shots using Amazon Rekognition software. The test incorrectly identified 28 legislators as criminals (cue the jokes - yes, the Congress members were confirmed to be elsewhere at the time). The ACLU hopes that showing the risk hits close to home will get Congress more interested in regulating the use of this technology.
The false matches were disproportionately of people of color, including six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-Ga.). These results demonstrate why Congress should join the ACLU in calling for a moratorium on law enforcement use of face surveillance.
[...] If law enforcement is using Amazon Rekognition, it’s not hard to imagine a police officer getting a “match” indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins. Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification.
(Score: 3, Informative) by c0lo on Sunday July 29 2018, @10:48AM (6 children)
Actually, the "80% confidence" setting is Amazon's default. Wanna bet your average doughnut-munching policeman won't change it? TFA
https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
(Score: 2) by The Shire on Sunday July 29 2018, @01:08PM (5 children)
You're missing the point of my post: it's a filter. Even at 80% it generates a narrowed list of subjects to investigate; it doesn't point the finger at one person. This is a tool, not a jury. The ACLU is deliberately misleading the public about how this is used.
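To make the "filter, not pointer" distinction concrete, here's a minimal sketch of how a confidence threshold narrows a candidate list. The names and similarity scores are made up for illustration (real values would come from the recognition service); the 80% figure is Amazon's default threshold, and 99% is the setting Amazon has said it recommends for law enforcement use.

```python
# Hypothetical similarity scores for illustration only - a real system
# returns these from the face-matching service.
matches = {
    "person_a": 99.1,
    "person_b": 87.4,
    "person_c": 81.2,
    "person_d": 62.0,
}

def candidates(matches, threshold):
    """Return the subjects whose similarity score meets the threshold."""
    return sorted(name for name, score in matches.items() if score >= threshold)

# At the 80% default, the tool yields a broad investigative list:
print(candidates(matches, 80))   # ['person_a', 'person_b', 'person_c']

# At a 99% threshold, the list narrows sharply:
print(candidates(matches, 99))   # ['person_a']
```

The point of contention in this thread is exactly this: whether police treat the first list as a pool of leads to investigate, or treat whoever tops it as "the match."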
(Score: 2, Insightful) by Anonymous Coward on Sunday July 29 2018, @01:52PM (1 child)
The people with the most skin in the game to deliberately mislead the public about how this will be used are Amazon and law enforcement. Unless we keep a close watch on them, this will be used as a pointer not a filter.
(Score: 1, Insightful) by Anonymous Coward on Sunday July 29 2018, @03:41PM
Yup! And it won't stop innocent people from being harassed or worse.
(Score: 3, Insightful) by c0lo on Sunday July 29 2018, @09:30PM (2 children)
it will be used as a pointer.
Step into the policeman's shoes, with limited (objectively or subjectively) effort capacity:
- without the tech, he needs to search for other clues first and apply the "looks like" filter later
- with the tech, he'll get a list of persons "identified" as possible suspects, and he'll very likely start working from that list.
Really? Is it already used?
If not, how can you be so sure?
https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
(Score: 2) by HiThere on Sunday July 29 2018, @11:42PM
FWIW, it *is* already used. (Possibly not exactly this software.) I don't know that it's used by police, but I believe it was yesterday or the day before that there was news on Soylent that two Canadian malls were using it, and that the owners of those malls claimed "others were using it". True, they also claimed they didn't keep the data that would allow individuals to be identified. Believe them if you want to; it might be true.
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
(Score: 2) by The Shire on Monday July 30 2018, @01:27AM
Yes, really, it's already being used:
https://www.npr.org/2018/05/22/613115969/orlando-police-testing-amazons-real-time-facial-recognition [npr.org]
And further:
"The Washington County Sheriff's Office says it does not use Rekognition in real time and doesn't intend to."
So they're using it correctly - as an investigative tool, not something that attempts to positively ID people in real time.