Just a reminder that this discussion aims to stimulate thought and self-awareness as tools to help those in recovery from trauma learn how to make safer choices. To make the discussion more jocular, we've defined Cognitive Biases as “CranioRectal Inversions” (CRI).
The previous post discussed the problem of assuming that other people think and do the same things that we do, though this does not hold true for every kind of belief. Social support plays a role in how we justify ourselves and how we perceive reality, and it works to bolster our feelings about ourselves and our decisions. But what do we know about what weakens or strengthens a False Consensus Bias?
Stronger Bias
In general, the False Consensus is stronger when the matter at hand is of great importance to a person and when they are intimately linked to it somehow. (A person is less affected by false consensus if asked in what company they would invest than if asked to choose between names they liked for a new child.)
Matters of faith prove to be very powerful and enhance a False Consensus, and this affects every aspect of the process of thought. A person who is a political conservative, for example, will be selective about what material they consider, favoring sources that already support their own views. They are also more likely to recall supporting examples from memory because they are more closely attached to them, while conveniently underestimating examples of opposing views. There is a sense of competition of sorts, so they are easily given to biases and fallacies which allow them to discount an opposing view. Even arguments which consider opposing views are processed as a 'know thy enemy' tactic instead of given a fair hearing.
Conformity and Survival
In a previous post, we mentioned the idea that if we put more emphasis on external factors (an external locus of control), we are quick to make those external pressures responsible for our fate. Our false consensus tendency drops significantly if people actually do share our stance. But if some factor that is internal and individually unique to us falls into question, we tend to take more personal responsibility for our fate and choices.
A factor that is often overlooked, especially in high-demand situations, concerns the consistency of shared understanding. We may interpret information or situations very differently than others do, which contributes to ambiguity. When making quick estimates under such conditions, we tend not to notice that things aren't terribly clear, so we overlook the disconnect. It has been described as something of an instinctual reflex that happens very quickly, and this ambiguity can weaken our biases, which may not be entirely a bad thing. People tend to willingly trust that they're sharing the same understanding as those around them, which changes their affinity for false consensus. Imagine how potent this pressure becomes within a cult.
I suppose that it has something to do with a survival response wherein we wonder what others know that we don't, and conformity overrides our false consensus tendency. The more unsure we feel about a situation and about how others around us behave, the more we tend to cave in to pressure. When we perceive that our beliefs are supported by a majority, we will stick to our convictions more strongly, but the opposite also proves to be true. This, of course, brings us to the Asch Experiment.
As the video describes, Solomon Asch studied the effects of social pressure on how a subject responded when identifying the lengths of lines in a test example. The panel was stocked with confederates who deliberately gave the wrong responses, which caused many of the subjects to report the wrong response in order to be consistent with the group. Such is an example of how conformity pressure can override an individual's own judgment and weaken the false consensus effect.
Strong Tendency to Avoid Giving Negative Feedback
Imagine that you are emerging from a closed society or social group, and you don't understand much about the subtle nuances of everyday life. Well, things are going to be a bit tough. Research seems to indicate that we humans are not very skilled at giving negative feedback. Though very close friends and family usually feel free to do so within a relationship of trust, we seem to pervasively avoid being honest about feedback when it is negative. We are less direct about it, we are slow to offer it, and we tend to avoid broaching disagreements. It doesn't even need to be a situation that might cause negative repercussions for us in some way. Most people are polite and are simply not good at offering feedback. How much more difficult does that make the transition into a new society for someone exiting a cult? Yet we need feedback to help us avoid the False Consensus trap.
This topic is vast, and I'll just defer to Thomas Gilovich in How We Know What Isn't So, in his summary statement at the end of the chapter discussing biases that arise from social influences:
Because so much disagreement remains hidden, our beliefs are not properly shaped by healthy scrutiny and debate. The absence of such argument also leads us to exaggerate the extent to which other people believe the way that we do. Bolstered by such a false sense of social support, our beliefs strike us as more warranted than is actually the case, and they become rather resistant to subsequent logical and empirical challenge.
For Further Reading until the next post:
- One of the $3 Kindle books about Cognitive Bias at Amazon.com
- Gilovich, Griffin & Kahneman's Heuristics and Biases
- Gilovich's How We Know What Isn't So
- Robert Cialdini's Influence: The Psychology of Persuasion
- Judith Herman's Trauma and Recovery