
posted on Thursday May 18 2017, @05:19AM
from the illusions-michael dept.

Humans treat 'inferred' visual objects generated by the brain as more reliable than external images from the real world, according to new research published in eLife.

The study, from the University of Osnabrück, Germany, reveals that when choosing between two identical visual objects -- one generated internally by filling in at the blind spot and one presented externally -- we are surprisingly likely to show a bias towards the internal information.

To make sense of the world, humans and animals need to combine information from multiple sources. This is usually done according to how reliable each piece of information is. For example, to know when to cross the street, we usually rely more on what we see than what we hear -- but this can change on a foggy day.
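This reliability-weighted averaging is often formalised as inverse-variance ("maximum-likelihood") cue combination. A minimal sketch in Python, with invented numbers, of how a cue loses influence as it gets noisier:

    # Reliability-weighted cue combination: each cue is weighted by its
    # inverse variance, so noisier cues count for less. All numbers here
    # are invented for illustration.

    def combine_cues(estimates, variances):
        """Maximum-likelihood fusion of independent Gaussian cues."""
        weights = [1.0 / v for v in variances]
        total = sum(weights)
        fused = sum(w * e for w, e in zip(weights, estimates)) / total
        fused_variance = 1.0 / total  # fused estimate beats either cue alone
        return fused, fused_variance

    # Clear day: vision (low variance) dominates hearing.
    print(combine_cues(estimates=[10.0, 14.0], variances=[1.0, 4.0]))
    # Foggy day: visual variance rises, so the auditory cue gains weight.
    print(combine_cues(estimates=[10.0, 14.0], variances=[9.0, 4.0]))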

"In such situations with the blind spot, the brain 'fills in' the missing information from its surroundings, resulting in no apparent difference in what we see," says senior author Professor Peter König, from the University of Osnabrück's Institute of Cognitive Science. "While this fill-in is normally accurate enough, it is mostly unreliable because no actual information from the real world ever reaches the brain. We wanted to find out if we typically handle this filled-in information differently to real, direct sensory information, or whether we treat it as equal."


Original Submission

 
  • (Score: 3, Insightful) by AthanasiusKircher (5291) on Thursday May 18 2017, @02:20PM (#511685) Journal

    Confirmation bias is a beautiful thing.

    That's what I thought, too. Yes, this particular example is a pretty "low-level" processing thing in the brain (or, at least, that's probably how we'd term it, even though what the brain is doing is quite sophisticated).

    But generally speaking, humans tend to place more weight on their "inferred" beliefs than on empirical data (both direct sensory data and abstract data, such as measurements collected for an experiment), and in fact we frequently -- and generally unconsciously -- try to mold any data to fit our existing beliefs; hence confirmation bias.

    TFA seems to agree:

    The team's interpretation is that subjects compare the internal representation (or 'template') of a continuous stimulus against the incoming sensory input, resulting in an error signal which represents the mismatch. In the absence of real information, no deviation and therefore no error or a smaller signal occurs, ultimately leading to a higher credibility at the decision-making stage.

    In other words, stuff in the "blind spot" is stuff the brain KNOWS is true (because it's generated internally, so it IS the mental representation by definition), so it doesn't need to think about it. Processing actual data takes more effort. This seems akin to most human concepts and beliefs: once you've come to a conclusion about how the "world works" and how your internal model ("template") represents it, that's basically how you interpret all future incoming data. Data that conflicts with or falls outside this model is implicitly judged as less reliable. Seems to me that this principle could be extended just as well to political or religious beliefs (which are basically complex "templates" for interpreting the world).
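    As a toy sketch of that mismatch idea -- assuming, purely for illustration, that credibility falls off as prediction error grows; this is not the study's actual model:

        # Compare an internal template against incoming input; a smaller
        # prediction error yields higher credibility at decision time.
        # The credibility function and all numbers here are illustrative.

        def credibility(template, sensory_input):
            """Return a 0-1 credibility score that falls as mismatch grows."""
            error = abs(template - sensory_input)  # prediction-error signal
            return 1.0 / (1.0 + error)             # zero error -> credibility 1.0

        template = 0.5  # internal model of the continuous stimulus

        # Blind spot: the 'input' IS the fill-in, so it matches exactly.
        print(credibility(template, sensory_input=0.5))   # 1.0

        # Real input: sensory noise produces a nonzero error.
        print(credibility(template, sensory_input=0.62))  # ~0.89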
