
posted by Fnord666 on Friday March 17 2017, @03:46AM
from the I-don't-believe-you dept.

There are facts, and there are beliefs, and there are things you want so badly to believe that they become as facts to you.

The theory of cognitive dissonance—the extreme discomfort of simultaneously holding two thoughts that are in conflict—was developed by the social psychologist Leon Festinger in the 1950s. In a famous study, Festinger and his colleagues embedded themselves with a doomsday prophet named Dorothy Martin and her cult of followers who believed that spacemen called the Guardians were coming to collect them in flying saucers, to save them from a coming flood. Needless to say, no spacemen (and no flood) ever came, but Martin just kept revising her predictions. Sure, the spacemen didn't show up today, but they were sure to come tomorrow, and so on. The researchers watched with fascination as the believers kept on believing, despite all the evidence that they were wrong.

This doubling down in the face of conflicting evidence is a way of reducing the discomfort of dissonance, and is part of a set of behaviors known in the psychology literature as "motivated reasoning." Motivated reasoning is how people convince themselves or remain convinced of what they want to believe—they seek out agreeable information and learn it more easily; and they avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs.

[...] People see evidence that disagrees with them as weaker, because ultimately, they're asking themselves fundamentally different questions when evaluating that evidence, depending on whether they want to believe what it suggests or not, according to psychologist Tom Gilovich.

[...] In 1877, the philosopher William Kingdon Clifford wrote an essay titled "The Ethics of Belief" [PDF], in which he argued: "It is wrong always, everywhere, and for anyone to believe anything on insufficient evidence."

[...] All manner of falsehoods—conspiracy theories, hoaxes, propaganda, and plain old mistakes—do pose a threat to truth when they spread like fungus through communities and take root in people's minds. But the inherent contradiction of false knowledge is that only those on the outside can tell that it's false. It's hard for facts to fight it because to the person who holds it, it feels like truth.

[...] In a New York Times article called "The Real Story About Fake News Is Partisanship", Amanda Taub writes that sharing fake news stories on social media that denigrate the candidate you oppose "is a way to show public support for one's partisan team—roughly the equivalent of painting your face with team colors on game day."

This sort of information tribalism isn't a consequence of people lacking intelligence or of an inability to comprehend evidence. Dan Kahan has previously written that whether people "believe" in evolution or not has nothing to do with whether they understand the theory—saying you don't believe in evolution is just another way of saying you're religious. Similarly, a recent Pew study found that a high level of science knowledge didn't make Republicans any more likely to say they believed in climate change, though it did for Democrats.

[...] People also learn selectively—they're better at learning facts that confirm their worldview than facts that challenge it. And media coverage makes that worse. While more news coverage of a topic seems to generally increase people's knowledge of it, one paper, "Partisan Perceptual Bias and the Information Environment," showed that when the coverage has implications for a person's political party, then selective learning kicks into high gear.

[...] Fact-checking erroneous statements made by politicians or cranks may also be ineffective. Brendan Nyhan's work has shown that correcting people's misperceptions often doesn't work, and worse, sometimes it creates a backfire effect, making people endorse their misperceptions even more strongly.

[...] So much of how people view the world has nothing to do with facts. That doesn't mean truth is doomed, or even that people can't change their minds. But what all this does seem to suggest is that, no matter how strong the evidence is, there's little chance of it changing someone's mind if they really don't want to believe what it says. They have to change their own.

https://www.theatlantic.com/science/archive/2017/03/this-article-wont-change-your-mind/519093/

[Related]:

The Nature and Origins of Misperceptions [PDF]

The Politics of Motivation [PDF]

Behavioral receptivity to dissonant information

"A man with a conviction is a hard man to change" [PDF]


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 0) by Anonymous Coward on Friday March 17 2017, @03:09PM (#480430) (1 child)

    > Buy guns.

    That's the last option, useful only once we've already gone over the cliff.
    It's the flip side of the same authoritarian coin.

  • (Score: 1) by kurenai.tsubasa (5227) on Friday March 17 2017, @08:56PM (#480621) Journal

    When we go over the cliff, you won't be able to buy guns anymore if you're an undesirable.

    Maybe we're not as close as I think we are. Don't give up on the ballot box just yet. In four years, maybe this will all seem like a bad dream. But be ready for the ballot box not to be enough. The jury box and soap box haven't been working for a while.