
posted by martyb on Sunday December 13 2015, @09:45PM   Printer-friendly
from the just-check-if-they-are-horizontal dept.

PsychCentral has a decent summary of a recent software-based effort from the University of Michigan to discover who's lying and who's not.

By carefully observing people telling lies during high-stakes court cases, researchers at the University of Michigan are developing unique lie-detecting software based on real-world data.

Their lie-detecting model considers both the person's words and gestures, and unlike a polygraph, it doesn't need to touch the speaker in order to work.

In experiments, the prototype was up to 75 percent accurate in identifying who was telling a lie (as defined by trial outcomes), compared with humans' scores of just above 50 percent. The tool might be helpful one day for security agents, juries, and even mental health professionals.

To develop the software, the researchers used machine-learning techniques to train it on a set of 120 video clips from media coverage of actual trials. Some of the clips they used were from the website of The Innocence Project, a national organization that works to exonerate the wrongfully convicted.
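
For readers who want a concrete picture of what "training on 120 clips" involves, here is a minimal sketch of that kind of pipeline; the feature set, classifier choice, and placeholder data are illustrative guesses on my part, not the researchers' actual code.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # Placeholder data: 120 clips, each reduced to a handful of per-clip features
    # (e.g. hand-movement rate, scowl/grimace count, "um" frequency, pronoun
    # distancing, eye-contact ratio), labelled deceptive/truthful by trial outcome.
    X = rng.random((120, 5))
    y = rng.integers(0, 2, size=120)  # 1 = deceptive (per trial outcome), 0 = truthful

    # With so few examples, cross-validation is the natural way to report accuracy.
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, y, cv=10)
    print(f"mean 10-fold accuracy: {scores.mean():.2f}")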

Researchers found that the people who were lying had a number of distinctive tells. They moved their hands more, scowled or grimaced, said "um" more frequently, and attempted to create a sense of distance between themselves and their alleged crime or civil misbehavior by using words like "he" or "she" rather than "I" or "we." Even more interesting, liars tended to make a greater effort at sounding sure of themselves — not only would they feign confidence, but they would also look the questioner in the eye, perhaps attempting to establish believability.
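
Two of the verbal tells above, "um" frequency and pronoun distancing, are easy to picture as transcript features. The snippet below is purely illustrative; the actual study also relied on annotated gestures and facial expressions, which plain text can't capture.

    import re

    def verbal_features(transcript: str) -> dict:
        """Crude transcript-level features inspired by the tells described above."""
        words = re.findall(r"[a-z']+", transcript.lower())
        n = max(len(words), 1)
        um_rate = words.count("um") / n
        first_person = sum(w in {"i", "we", "me", "us", "my", "our"} for w in words)
        third_person = sum(w in {"he", "she", "they", "him", "her", "them"} for w in words)
        # Higher values suggest more "distancing" language.
        distancing = third_person / max(first_person + third_person, 1)
        return {"um_rate": um_rate, "distancing": distancing}

    print(verbal_features("Um, he took the money. He was there, not me."))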

"In laboratory experiments, it's difficult to create a setting that motivates people to truly lie. The stakes are not high enough,...We can offer a reward if people can lie well — pay them to convince another person that something false is true. But in the real world there is true motivation to deceive. People are poor lie detectors. This isn't the kind of task we're naturally good at. There are clues that humans give naturally when they are being deceptive, but we're not paying close enough attention to pick them up."

"It was 75 percent accurate in identifying who was lying. That's much better than humans, who did just better than a coin-flip."

"The system might one day be a helpful tool for security agents, juries and even mental health professionals."

I have to imagine this is a child's game compared to what Three Letter Agencies have developed.


Original Submission

 
  • (Score: 3, Informative) by pipedwho (2032) on Monday December 14 2015, @01:13AM (#275937)

    Assuming someone is lying based on the outcome of the case (i.e. a guilty/not-guilty verdict) is a pretty poor metric to calibrate against. That 75% could easily be eaten back towards 50% once you allow for a potentially high rate of innocent people being convicted.
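
    As a rough illustration (the verdict error rates below are hypothetical, and this is just bounding arithmetic, not anything from TFA): if the tool agrees with verdicts 75% of the time while the verdicts themselves are wrong at some rate e, its accuracy against actual guilt/innocence is only pinned down to within about e of that 75%, and the worst case is precisely the one where the tool is fooled by the same cues as the jury.

        def true_accuracy_bounds(agreement_with_verdicts: float, verdict_error_rate: float):
            """Range of possible accuracy against actual guilt/innocence, given only
            agreement with verdicts and the rate at which verdicts are wrong.
            Assumes e <= m and m + e <= 1, which covers the values below."""
            m, e = agreement_with_verdicts, verdict_error_rate
            lower = max(0.0, m - e)  # tool's mistakes line up with the jury's mistakes
            upper = min(1.0, m + e)  # tool is right on some cases the jury got wrong
            return lower, upper

        for e in (0.05, 0.15, 0.25):
            lo, hi = true_accuracy_bounds(0.75, e)
            print(f"verdict error rate {e:.0%}: true accuracy somewhere in {lo:.0%}-{hi:.0%}")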

    The other problem is confirmation bias: cases only get put forward when the prosecutor already has a strong belief that there is sufficient merit to convict a person pleading 'not guilty'. That means the defendant is, by definition, already assumed by the prosecuting attorneys/police to be lying about their guilt (otherwise, most of the time, the case would have been dropped to avoid wasting time in court).

    So the claim that this software is 75% accurate seems highly suspect, and most likely significantly overstates its accuracy.

    And even if it were truly 75% accurate, that is still nowhere near sufficient for anything beyond large-scale population analysis: possibly showing that X% of convictions/acquittals were incorrectly decided, with no way to accurately determine which are which.
