
posted by Fnord666 on Saturday February 15 2020, @09:51PM
from the Gurdjieff-taught-it-is-not-so-easy-to-remember-yourself dept.

A weekly financial newsletter included this link, https://www.titlemax.com/discovery-center/lifestyle/50-cognitive-biases-to-be-aware-of-so-you-can-be-the-very-best-version-of-you/. A cute graphical "flash card" version of the same list is available at https://www.visualcapitalist.com/50-cognitive-biases-in-the-modern-world/. Each "card" includes a short example that I found helpful in understanding the definitions.

Along with the ever-popular Dunning-Kruger Effect, the list had some eye-openers for me. Here are the first ten. As a mental exercise, think about how many more you are aware of... before going to either of the links for a peek:

1. Fundamental Attribution Error: We judge others on their personality or fundamental character, but we judge ourselves on the situation.

2. Self-Serving Bias: Our failures are situational, but our successes are our responsibility.

3. In-Group Favoritism: We favor people who are in our in-group as opposed to an out-group.

4. Bandwagon Effect: Ideas, fads, and beliefs grow as more people adopt them.

5. Groupthink: Due to a desire for conformity and harmony in the group, we make irrational decisions, often to minimize conflict.

6. Halo Effect: If you see a person as having a positive trait, that positive impression will spill over into their other traits. (This also works for negative traits.)

7. Moral Luck: Better moral standing happens due to a positive outcome; worse moral standing happens due to a negative outcome.

8. False Consensus: We believe more people agree with us than is actually the case.

9. Curse of Knowledge: Once we know something, we assume everyone else knows it, too.

10. Spotlight Effect: We overestimate how much people are paying attention to our behavior and appearance.

At some level, I suppose this is click-bait, but this bait got me thinking.


Original Submission

 
  • (Score: 3, Informative) by Anonymous Coward on Sunday February 16 2020, @01:36AM (13 children)

    Here's the rest of the 50; copy-paste didn't get the numbers.

    I know I have trouble with this one: in meetings I have to keep listening even after I hear something that sounds "right":
        Anchoring: We rely heavily on the first piece of information introduced when making decisions.

    Availability Heuristic: We rely on immediate examples that come to mind while making judgments.

    Defensive Attribution: As a witness who secretly fears being vulnerable to a serious mishap, we will blame the victim less if we relate to the victim.

    Just-World Hypothesis: We tend to believe the world is just; therefore, we assume acts of injustice are deserved.

    Naïve Realism: We believe that we observe objective reality and that other people are irrational, uninformed, or biased.

    Naïve Cynicism: We believe that we observe objective reality and that other people have a higher egocentric bias than they actually do in their intentions/actions.

    Forer Effect (aka Barnum Effect): We easily attribute our personalities to vague statements, even if they can apply to a wide range of people.

    Dunning-Kruger Effect: The less you know, the more confident you are. The more you know, the less confident you are.

    Anchoring: We rely heavily on the first piece of information introduced when making decisions.

    Automation Bias: We rely on automated systems, sometimes trusting too much in the automated correction of actually correct decisions.

    Google Effect (aka Digital Amnesia): We tend to forget information that’s easily looked up in search engines.

    Reactance: We do the opposite of what we’re told, especially when we perceive threats to personal freedoms.

    Confirmation Bias: We tend to find and remember information that confirms our perceptions.

    Backfire Effect: Disproving evidence sometimes has the unwarranted effect of confirming our beliefs.

    Third-Person Effect: We believe that others are more affected by mass media consumption than we ourselves are.

    Belief Bias: We judge an argument’s strength not by how strongly it supports the conclusion but how plausible the conclusion is in our own minds.

    Availability Cascade: Tied to our need for social acceptance, collective beliefs gain more plausibility through public repetition.

    Declinism: We tend to romanticize the past and view the future negatively, believing that societies/institutions are by and large in decline.

    Status Quo Bias: We tend to prefer things to stay the same; changes from the baseline are considered to be a loss.

    Sunk Cost Fallacy (aka Escalation of Commitment): We invest more in things that have cost us something rather than altering our investments, even if we face negative outcomes.

    Gambler’s Fallacy: We think future possibilities are affected by past events.

    Zero-Risk Bias: We prefer to reduce small risks to zero, even if we can reduce more risk overall with another option.

    Framing Effect: We often draw different conclusions from the same information depending on how it’s presented.

    Stereotyping: We adopt generalized beliefs that members of a group will have certain characteristics, despite not having information about the individual.

    Outgroup Homogeneity Bias: We perceive out-group members as homogeneous and our own in-groups as more diverse.

    Authority Bias: We trust and are more often influenced by the opinions of authority figures.

    Placebo Effect: If we believe a treatment will work, it often will have a small physiological effect.

    Survivorship Bias: We tend to focus on those things that survived a process and overlook ones that failed.

    Tachypsychia: Our perceptions of time shift depending on trauma, drug use, and physical exertion.

    Law of Triviality (aka “Bike-Shedding”): We give disproportionate weight to trivial issues, often while avoiding more complex issues.

    Zeigarnik Effect: We remember incomplete tasks more than completed ones.

    IKEA Effect: We place higher value on things we partially created ourselves.

    Ben Franklin Effect: We like doing favors; we are more likely to do another favor for someone if we’ve already done a favor for them than if we had received a favor from that person.

    Bystander Effect: The more other people are around, the less likely we are to help a victim.

    Suggestibility: We, especially children, sometimes mistake ideas suggested by a questioner for memories.

    False Memory: We mistake imagination for real memories.

    Cryptomnesia: We mistake real memories for imagination.

    Clustering Illusion: We find patterns and “clusters” in random data.

    Pessimism Bias: We sometimes overestimate the likelihood of bad outcomes.

    Optimism Bias: We sometimes are over-optimistic about good outcomes.

    Blind Spot Bias: We don’t think we have bias, and we see it in others more than in ourselves.

  • (Score: 0) by Anonymous Coward on Sunday February 16 2020, @01:56AM (2 children)

    The numbers didn't copy because they're fucking images. I don't even...

    Also they have one I know I don't have! To wit:

    Blind Spot Bias: We don’t think we have bias, and we see it in others more than in ourselves.

    Definitely I don't. Maybe other people have this one?

    • (Score: 0) by Anonymous Coward on Sunday February 16 2020, @03:47AM

      Bias Bias: We become obsessed with biases, make irrational decisions, and have sex with cats.

      Maybe I just have biasophobia?

    • (Score: 1) by khallow on Sunday February 16 2020, @04:12AM

      Definitely I don't. Maybe other people have this one?

      I'm pretty sure other people have that one.

  • (Score: 3, Insightful) by Coward, Anonymous on Sunday February 16 2020, @02:30AM (6 children)

    Can we just summarize this as humans are fallible?

    • (Score: 0) by Anonymous Coward on Sunday February 16 2020, @03:09AM (1 child)

      Where's the -1 party pooper mod?

      • (Score: 3, Funny) by Coward, Anonymous on Sunday February 16 2020, @04:07AM

        That would count as #3 In-Group Favoritism. Also Naïve Realism, Availability Cascade, Law of Triviality, and possibly IKEA Effect.

    • (Score: 0) by Anonymous Coward on Monday February 17 2020, @04:04PM

      Can we just summarize this as humans are fallible?

      That's like setting your JPEG compression settings to "make the whole picture a shade of gray and call it a day".

    • (Score: 2) by hendrikboom on Saturday February 22 2020, @12:55PM (2 children)

      Can we just summarize this as humans are fallible?

      Only if we don't want to compensate for bias in order to become more objective.

      • (Score: 3, Insightful) by Coward, Anonymous on Sunday February 23 2020, @08:55AM (1 child)

        The errors I catch myself making and also see in others are more basic than some subtle "bias". Correcting for bias is like making declination adjustments to a compass reading. First we need to know which end of the needle points North, and which end points South.

  • (Score: 4, Informative) by theluggage on Sunday February 16 2020, @12:32PM (2 children)

    I think TFA has provided a good name for one of the missing options:

    Flashcard logic: basing an argument on over-simplistic bullet lists of 'cognitive biases' or 'fallacies' (with a side-order of getting the two confused) and name-checking the effect without looking critically at its application.

    Nothing in the list is wrong, but there are a lot of unstated assumptions, context issues, and potential abuses. Also, it's a bit of a mix of "logical fallacies" and descriptions of human tendencies. E.g., to cherry-pick (oops!) a few:

    "Status Quo Bias" can be abused to support change for the sake of change, or "worse is the new better" which seems to be the current trend in software. If you're proposing change, its your job to prove that it's better than the status quo.

    "Sunk Costs Fallacy" - presumes that the past "investments" have been wasted and that completing the project has no additional benefits. E.g. those arrangement fees you sent to the former Nigerian finance minister are probably sunk costs because you know they're just gone. On the other hand, if you spend that last $1m on the bridge that's gone over-budget then you'll have a working bridge worth $$$ to the local economy. Continue paying into that endowment policy for another 10 years until it matures and hopefully makes a gain or cash it in now at a loss because "sunk costs"? You'll just have to do the math - just like in any potential "sunk costs" scenario the devil is the details. Very easy to abuse in order to defend "not invented here" or "new broom" scenarios.

    "Gambler’s Fallacy": like all the probability taught in high school, this assumes that the events are truly random and independent - unless you're literally dealing with something like the roll of a "fair" die, future possibilities are mostly affected by past events. Forgetting this has had tragic consequences [wikipedia.org].

    "Authority Bias" - as stated presumes that authority is wrong. In the real world, the "authority" is more likely to be experienced or knowledgable, even if there are plenty of notable exceptions! The fallacy is continuing to favour the authority position in the face of stronger evidence to the contrary. If someone tells you that Newton's laws of motion are incorrect, that someone is probably wrong. If they then come up with a pile of experimental evidence about the orbit of Mercury and weird shit happening with interferometers and you still insist that Newton was beyond question then that's "Authority Bias".

    "Placebo Effect" - don't knock it. In pretty much any condition, anything that reduces stress or anxiety will at lesat have a palliative effect. The problem is when the effect is used to promote anti-science. If lying down listening to whalesong while someone waves a crystal over you makes you feel better, go with it, just remember that the whale is probably talking more sense than the crystal-waver pattering on about auras. Just listen to your doctor as well (unless they tell you that the answer is more opiates, because the health industry is not immune to a spot of anti-science).

    "Bystander Effect": because 21 calls to 911 are always so much more effective than just 20. As opposed to "let me through: I have Dunning-Krueger!"

    And so on, with the normal problem being not that the underlying effect or fallacy is wrong, but that the one-line summary omits important caveats (which you probably will see if you dig deeper, follow the links, etc.). With the "fallacies" the problem is often that they are only fallacious when used in an attempt to rebut a better argument in a debate with a falsifiable/provable outcome.
     

    • (Score: 2) by hendrikboom on Saturday February 22 2020, @01:01PM (1 child)

      My doctor occasionally prescribes a placebo, and tells me that it's a placebo, because I believe in the placebo effect and she knows it.

      Sometimes a placebo is the most effective treatment. And no side effects!

      • (Score: 3, Funny) by theluggage on Saturday February 22 2020, @05:44PM

        Sometimes a placebo is the most effective treatment. And no side effects!

        Well, it is the standard against which all other medicines are judged...

        However, it's disturbing that you haven't heard of the side effects of placebos - folks these days don't know how to use YouTube to learn themselves better factoids! Placebos are well known to cause hypochondria, and some contain sucrose - a highly dangerous and possibly addictive substance which is at the centre of a number of major public health issues (and the synthetic substitutes are even worse!). Liquid placebos often contain dangerous levels of dihydrogen monoxide too, and could kill you if injected directly into your bloodstream with a dirty needle.

        /s