
SoylentNews is people

posted by Fnord666 on Friday March 17 2017, @03:46AM   Printer-friendly
from the I-don't-believe-you dept.

There are facts, and there are beliefs, and there are things you want so badly to believe that they become as facts to you.

The theory of cognitive dissonance—the extreme discomfort of simultaneously holding two thoughts that are in conflict—was developed by the social psychologist Leon Festinger in the 1950s. In a famous study, Festinger and his colleagues embedded themselves with a doomsday prophet named Dorothy Martin and her cult of followers who believed that spacemen called the Guardians were coming to collect them in flying saucers, to save them from a coming flood. Needless to say, no spacemen (and no flood) ever came, but Martin just kept revising her predictions. Sure, the spacemen didn't show up today, but they were sure to come tomorrow, and so on. The researchers watched with fascination as the believers kept on believing, despite all the evidence that they were wrong.

This doubling down in the face of conflicting evidence is a way of reducing the discomfort of dissonance, and is part of a set of behaviors known in the psychology literature as "motivated reasoning." Motivated reasoning is how people convince themselves or remain convinced of what they want to believe—they seek out agreeable information and learn it more easily; and they avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs.

[...] People see evidence that disagrees with them as weaker, because ultimately, they're asking themselves fundamentally different questions when evaluating that evidence, depending on whether they want to believe what it suggests or not, according to psychologist Tom Gilovich.

[...] In 1877, the philosopher William Kingdon Clifford wrote an essay titled "The Ethics of Belief" [PDF], in which he argued: "It is wrong always, everywhere, and for anyone to believe anything on insufficient evidence."

[...] All manner of falsehoods—conspiracy theories, hoaxes, propaganda, and plain old mistakes—do pose a threat to truth when they spread like fungus through communities and take root in people's minds. But the inherent contradiction of false knowledge is that only those on the outside can tell that it's false. It's hard for facts to fight it because to the person who holds it, it feels like truth.

[...] In a New York Times article called "The Real Story About Fake News Is Partisanship", Amanda Taub writes that sharing fake news stories on social media that denigrate the candidate you oppose "is a way to show public support for one's partisan team—roughly the equivalent of painting your face with team colors on game day."

This sort of information tribalism isn't a consequence of people lacking intelligence or of an inability to comprehend evidence. Kahan has previously written that whether people "believe" in evolution or not has nothing to do with whether they understand the theory of it—saying you don't believe in evolution is just another way of saying you're religious. Similarly, a recent Pew study found that a high level of science knowledge didn't make Republicans any more likely to say they believed in climate change, though it did for Democrats.

[...] People also learn selectively—they're better at learning facts that confirm their worldview than facts that challenge it. And media coverage makes that worse. While more news coverage of a topic seems to generally increase people's knowledge of it, one paper, "Partisan Perceptual Bias and the Information Environment," showed that when the coverage has implications for a person's political party, then selective learning kicks into high gear.

[...] Fact-checking erroneous statements made by politicians or cranks may also be ineffective. Nyhan's work has shown that correcting people's misperceptions often doesn't work, and worse, sometimes it creates a backfire effect, making people endorse their misperceptions even more strongly.

[...] So much of how people view the world has nothing to do with facts. That doesn't mean truth is doomed, or even that people can't change their minds. But what all this does seem to suggest is that, no matter how strong the evidence is, there's little chance of it changing someone's mind if they really don't want to believe what it says. They have to change their own.

https://www.theatlantic.com/science/archive/2017/03/this-article-wont-change-your-mind/519093/

[Related]:

The Nature and Origins of Misperceptions [PDF]

The Politics of Motivation [PDF]

Behavioral receptivity to dissonant information

"A man with a conviction is a hard man to change" [PDF]


Original Submission

 
  • (Score: 0) by Anonymous Coward on Friday March 17 2017, @01:33PM (1 child)


    "Okay, yeah, the data show Group X has a lower collective IQ than Group Y. And...so fucking what? Where do you get off claiming that members of X are somehow undeserving of life compared to Y because of that? That is a blatant non-sequitur."

    That argument isn't persuasive. If a group is provably different from another group in a meaningful way, which is the entire point of anyone making these arguments, then it follows that they should be treated in a way that takes that difference into consideration. And even if that doesn't lead all the way to eugenics, it does give moral support to paternalistic social policy like Charles Murray's The Advantages of Social Apartheid. [aei.org]

    The powerful are always looking to find ways to justify and expand their dominance. Science is seen as a neutral arbiter of truth in the modern world, so the powerful will cloak their justifications in scientism and there will always be people willing to uncritically go along because it suits their prejudices. You can make all the moral arguments about personhood that you want, but if the science is bent to 'prove' that one group has less personhood than another that will be convincing, just like it is convincing at a more mundane level to say that fair-skinned people are more susceptible to sunburn than their melanin-rich counterparts.

  • (Score: 2) by Azuma Hazuki on Friday March 17 2017, @11:40PM


    My argument *is* a scientific one though, or at least a logical one. It's specifically saying "you are overextending your epistemological reach here. You cannot get ought from is in this manner." The facts *are* neutral; science itself, because it is a human endeavour, cannot help but be at least somewhat politicized, but the deliberate abuse of science and statistics has to stop. Those tools were never meant to be used this way, any more than a knitting needle is designed for eating soup, and willful misuse of them opens a Pandora's box of clinging, deep-burrowing evil.

    --
    I am "that girl" your mother warned you about...