posted by Fnord666 on Saturday March 21 2020, @11:57PM   Printer-friendly
from the the-eyes-are-the-window dept.

Arthur T Knackerbracket has found the following story:

Hanwang, the facial-recognition company that has placed 2 million of its cameras at entrance gates across the world, started preparing for the coronavirus in early January.

Huang Lei, the company’s chief technical officer, said that even before the new virus was widely known about, he had begun to get requests from hospitals at the centre of the outbreak in Hubei province to update its software to recognise nurses wearing masks.

[...] "If three or five clients ask for the same thing . . . we'll see that as important," said Mr Huang, adding that its cameras previously recognised people in masks only half the time, compared with 99.5 percent accuracy for a full-face image.

[...] The company now says its masked facial recognition program has reached 95 percent accuracy in lab tests, and even claims that it is more accurate in real life, where its cameras take multiple photos of a person if the first attempt to identify them fails.
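The multi-shot behaviour described above amounts to retrying identification on fresh frames until one clears a match threshold. Below is a minimal sketch of that logic in Python; Hanwang's pipeline is proprietary, so capture_frame, embed, the gallery format, and the 0.6 cosine-similarity cutoff are illustrative assumptions, not the company's method.

    import numpy as np

    MATCH_THRESHOLD = 0.6   # assumed cosine-similarity cutoff
    MAX_ATTEMPTS = 5        # assumed retry budget

    def identify(probe, gallery):
        """Return (person_id, score) for the gallery entry most similar to probe."""
        best_id, best_score = None, -1.0
        for person_id, ref in gallery.items():
            score = float(np.dot(probe, ref) /
                          (np.linalg.norm(probe) * np.linalg.norm(ref)))
            if score > best_score:
                best_id, best_score = person_id, score
        return best_id, best_score

    def identify_with_retries(capture_frame, embed, gallery):
        """Take fresh frames until one matches confidently, else give up."""
        for _ in range(MAX_ATTEMPTS):
            person, score = identify(embed(capture_frame()), gallery)
            if score >= MATCH_THRESHOLD:
                return person, score   # confident match on this frame
        return None, 0.0               # no frame cleared the threshold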

“The problem of masked facial recognition is not new, but belongs to the family of facial recognition with occlusion,” Mr Huang said, adding that his company had first encountered similar issues with people with beards in Turkey and Pakistan, as well as with northern Chinese customers wearing winter clothing that covered their ears and face.

Counter-intuitively, training facial recognition algorithms to recognize masked faces involves throwing data away. A team at the University of Bradford published a study last year showing they could train a facial recognition program to accurately recognize half-faces by deleting parts of the photos they used to train the software.
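In practice, the Bradford result amounts to a data-augmentation step: blank out the region a mask would cover in each training photo, and train on both the original and the occluded copy. A minimal sketch with Pillow, where the file path is a placeholder and blanking the lower half is a crude stand-in for the study's actual occlusion scheme:

    from PIL import Image, ImageDraw

    def mask_lower_half(img):
        """Blank the lower half of a face crop, roughly where a mask would sit."""
        out = img.copy()
        w, h = out.size
        ImageDraw.Draw(out).rectangle([0, h // 2, w, h], fill=(0, 0, 0))
        return out

    # Augment the training set with half-face variants alongside the originals,
    # so the model learns features from the unoccluded upper half.
    face = Image.open("train/person_001.jpg").convert("RGB")
    mask_lower_half(face).save("train/person_001_masked.jpg")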


Original Submission

Related Stories

AWS Facial Recognition Platform Misidentified Over 100 Politicians as Criminals

AWS Facial Recognition Platform Misidentified Over 100 Politicians As Criminals:

Comparitech's Paul Bischoff found that Amazon's facial recognition platform misidentified an alarming number of people, and was racially biased.

Facial recognition technology is still misidentifying people at an alarming rate – even as it's being used by police departments to make arrests. In fact, Paul Bischoff, consumer privacy expert with Comparitech, found that Amazon's face recognition platform misidentified more than 100 photos of US and UK lawmakers as criminals.

Rekognition, Amazon's cloud-based facial recognition platform that was first launched in 2016, has been sold to and used by a number of United States government agencies, including ICE and the Orlando, Florida police, as well as private entities. In comparing photos of a total of 1,959 US and UK lawmakers to subjects in an arrest database, Bischoff found that Rekognition misidentified an average of 32 members of Congress. That's four more than in a similar experiment conducted by the American Civil Liberties Union (ACLU) two years ago. Bischoff also found that the platform was racially biased, misidentifying non-white people at a higher rate than white people.
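For a sense of what such a test involves, here is a minimal sketch against Rekognition's CompareFaces operation via boto3. The file paths are placeholders and the 80-point similarity threshold (the API's documented default) is an assumption; this shows the general shape of the experiment, not Bischoff's exact methodology.

    import boto3

    rekognition = boto3.client("rekognition")

    def best_similarity(lawmaker_path, mugshot_path, threshold=80.0):
        """Compare one lawmaker photo against one arrest-database photo."""
        with open(lawmaker_path, "rb") as a, open(mugshot_path, "rb") as b:
            resp = rekognition.compare_faces(
                SourceImage={"Bytes": a.read()},
                TargetImage={"Bytes": b.read()},
                SimilarityThreshold=threshold,
            )
        # FaceMatches is empty when no face pair clears the threshold.
        return max((m["Similarity"] for m in resp["FaceMatches"]), default=None)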

These findings have disturbing real-life implications. Last week, the ACLU shed light on Detroit citizen Robert Julian-Borchak Williams, who was arrested after a facial recognition system falsely matched his photo with security footage of a shoplifter.

  • (Score: 0) by Anonymous Coward on Sunday March 22 2020, @12:19AM

    by Anonymous Coward on Sunday March 22 2020, @12:19AM (#973974)

    They all look the same and no one complains about false positives after they've been executed.

  • (Score: 2, Insightful) by Anonymous Coward on Sunday March 22 2020, @01:38AM (7 children)

    by Anonymous Coward on Sunday March 22 2020, @01:38AM (#973992)

    Even more effective dystopian mass surveillance! Let's write article after article about how amazing this is, and only occasionally allude to the inevitable abuses and destruction of democracy.

    • (Score: 2) by Rosco P. Coltrane on Sunday March 22 2020, @10:50AM (1 child)

      by Rosco P. Coltrane (4757) on Sunday March 22 2020, @10:50AM (#974099)

      Well, technically it's amazing. Much of the stuff of nightmares is - nuclear bombs, for instance.

      • (Score: 0) by Anonymous Coward on Sunday March 22 2020, @11:13AM

        by Anonymous Coward on Sunday March 22 2020, @11:13AM (#974102)

        Nukes detonated at high altitude are perfect for disabling these sorts of systems, and have the added bonus of vacating the location so you can stroll it in peace :)

        #gammapower #emppulse

    • (Score: 0) by Anonymous Coward on Sunday March 22 2020, @11:54AM (2 children)

      by Anonymous Coward on Sunday March 22 2020, @11:54AM (#974107)

      Even more effective dystopian mass surveillance! Let's write article after article about how amazing this is, and only occasionally allude to the inevitable abuses and destruction of democracy.

      It's not what it is, it's how you use it.

      A knife can make you a nice meal or kill you. A face recognition software can detect and identify disease spread vectors or it can be used to track political opponents. It's how you use it.

      • (Score: 0) by Anonymous Coward on Monday March 23 2020, @01:01AM

        by Anonymous Coward on Monday March 23 2020, @01:01AM (#974270)

        A knife can't curtail the freedoms of millions of people. And the existence of a knife doesn't make it easier for others to abuse them.

      • (Score: 0) by Anonymous Coward on Monday March 23 2020, @06:02PM

        by Anonymous Coward on Monday March 23 2020, @06:02PM (#974504)

        It's not what it is, it's how you use it.

        Mass surveillance necessarily threatens democracy and freedom. [gnu.org] Human history shows that when given absolute power, humans will abuse it. There is no way to have mass surveillance without it being abused. Existing governments, which already have some forms of mass surveillance, demonstrate that fact perfectly.

        Mass surveillance of all forms must be banned, for freedom and democracy. No amount of safety - real or imaginary - is worth allowing it.

    • (Score: 2) by Grishnakh on Monday March 23 2020, @04:16AM (1 child)

      by Grishnakh (2831) on Monday March 23 2020, @04:16AM (#974315)

      and only occasionally allude to the inevitable abuses and destruction of democracy.

      What democracy? China never had any democracy that could be destroyed.

      If you're concerned about it coming here, you shouldn't be. We have democracy here. So if it does come here and is put in place by our government, that's because we the people wanted it, as seen by how we vote. If we didn't want it, we would be sure to vote out anyone who tries to bring dystopian mass surveillance here.

      • (Score: 0) by Anonymous Coward on Monday March 23 2020, @05:59PM

        by Anonymous Coward on Monday March 23 2020, @05:59PM (#974502)

        If you're concerned about it coming here, you shouldn't be. We have democracy here.

        No, the US functions more as an oligarchy.

        So if it does come here and is put in place by our government, that's because we the people wanted it, as seen by how we vote. If we didn't want it, we would be sure to vote out anyone who tries to bring dystopian mass surveillance here.

        The problem is, the same morons who succumb to propaganda and vote in favor of totalitarianism affect the rest of us, and our Supreme Court is packed with authoritarians. If the Supreme Court can get away with approving of Japanese internment camps and other abominations, despite them being blatantly and absolutely unconstitutional, it can get away with anything.

        So of course I'm going to be worried. We barely have a democracy currently, and idiots will vote in favor of destroying what little democracy we do have.

  • (Score: 2) by ilsa on Sunday March 22 2020, @01:03PM (1 child)

    by ilsa (6082) Subscriber Badge on Sunday March 22 2020, @01:03PM (#974115)

    has reached 95 percent accuracy in lab tests, and even claims that it is more accurate in real life, where its cameras take multiple photos of a person if the first attempt to identify them fails.

    While I have no knowledge of the details of this system, that one statement alone makes me call shenanigans. I wouldn't be surprised if the accuracy is a good 25% lower than stated, if not lower still.

    There is no way you can implement a system like this without adequate testing*, and you certainly don't just jump from single- to multiple-image recognition at the drop of a hat. It's exceedingly unlikely that real-life performance beat the lab, where you can do a much better job of controlling the variables.

    He either has no idea how his own system works, or he's lying through his teeth.

    *I mean, you can, if integrity and quality aren't a priority.

    • (Score: 0) by Anonymous Coward on Sunday March 22 2020, @07:00PM

      by Anonymous Coward on Sunday March 22 2020, @07:00PM (#974193)

      Um. If the lab's train/test sets use single snapshots, I can assure you that better geometry and texture can be recovered from (multiple stills extracted from) video, assuming the state of the art. The counterclaim would be far more extraordinary.

      Of course, phrasing it as "better in real life than in the lab" is disingenuous; they should be using video, not stills, in the lab (train and) test sets.
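      In sketch form - embed() below is a stand-in for any face-embedding model, and mean-pooling is just one simple way multiple frames can beat a single still:

          import numpy as np

          def video_embedding(frames, embed):
              """Average L2-normalised per-frame embeddings into one vector."""
              embs = [e / np.linalg.norm(e) for e in (embed(f) for f in frames)]
              mean = np.mean(embs, axis=0)
              return mean / np.linalg.norm(mean)  # per-frame noise averages out

      Matching then proceeds exactly as with a single still, but the pooled vector is typically closer to the enrolled reference than any one frame's embedding is.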

  • (Score: 0) by Anonymous Coward on Sunday March 22 2020, @06:57PM

    by Anonymous Coward on Sunday March 22 2020, @06:57PM (#974191)

    Counter-intuitively, training facial recognition algorithms to recognize masked faces involves throwing data away. A team at the University of Bradford published a study last year showing they could train a facial recognition program to accurately recognize half-faces by deleting parts of the photos they used to train the software.

    [emphasis mine]

    How tf is it counter-intuitive that, to train a model to match when masks are worn, it's best to mask out the masks in the training data? It's literally right there in the photoshop/gimp verb form of "mask".
