Re: Should we preach to anyone but the choir?
There is a saying that everyone is entitled to their own opinions, but
not their own facts. These days, it seems more and more that what
people consider to be facts is diverging. This is easiest to observe
in the political realm, but it's also becoming common in less directly
political areas, such as teaching curricula, the climate debate, etc. It's
not just that people are expressing disagreements over values. Those
kinds of disagreements over what is right or wrong, or what is more
important than what, have always been with us, and always will be. And,
arguably, they're a good thing, since they get us to consider other points
of view. I'm also not just talking about disagreements over future
predictions, or estimated effects of one action or another. Again,
predicting uncertain events in the future will always be difficult and
will always lead to disagreement. What I'm talking about here is more
disagreement about what's actually true, right now (or in the past),
right in front of us.
I'm certainly not the first to point this out, and I'm sure I won't be
the last. So I'm not going to get into too many examples, or try to
argue who's right in which case. I'm actually more interested in the
situation itself. How does it come about, and can anything be done about
it? I've been noodling on it a bit today, and thought I'd run some
ideas by you guys to see what you thought.
First, most of the things we see this kind of disagreement on (though
not all) are complicated things. The data is there for anyone to look
at, but there's so much data, related in non-trivial ways, that 99.9% of
us won't go to the trouble of looking at it. We'll get a summary handed
to us second hand from a source we trust, and accept their analysis as
truth. Sure, we could go look at the numbers ourselves, but why bother?
The source has already done it for us, and we know they wouldn't try to
mislead us.
This means which sources we trust will make a big difference in what we
consider to be true. But how do we decide what sources we trust?
Usually based on what they tell us. If they tell us things that seem
true to us, we're likely to believe them. If they tell us things that
don't match our beliefs, we're likely to distrust them. This leads to a
positive feedback loop, where once we decide one source is trustworthy,
we start accepting what they say as true, dismissing those who disagree
with them, and trusting those who agree with them. Beyond the initial
"choice" (I put this in quotes because I don't think it's actually a
conscious decision) of who to trust, it sort of progresses naturally on
its own, without our direction.
Add to this the echo chamber effect, where source X says something, Y
hears it and trusts X so repeats it as well, then Z hears X and Y saying
it, so thinks it couldn't possibly be wrong and repeats it as well.
Then it gets back to X, who says "it's not just me saying it, these
other folks have looked into it and reached the same conclusion!" when
really it all traces back to just X saying it in the first place. This has
become more common in the "new media" world, where "trusted sources" are
just as likely to be random bloggers reading each other's blogs as
independent news sources.
Another factor is tribalism, or us-and-them-ism. It's not just
trusted sources telling you what you already believe, but also strongly
distrusted sources saying the opposite. But instead of making us less
confident, this actually reinforces our beliefs. "If X said it's
true, then I know it has to be false!" A voice of disagreement in such
cases actually makes us more confident we're right. This makes it
particularly difficult to escape the echo chamber, as voices from the
"outside" trying to point out any errors in our thinking actually
strengthen our conviction.
It would seem the only way to escape the echo chamber is for someone
inside it to realize an error and point it out to those who trust them.
But this doesn't always work. Often, instead of giving up our views, we
give up our trust. We view them as a "traitor" or the like, someone
who's betrayed the truth. We stop trusting them, rather than
considering what they say. Voices of disagreement from within get
pushed quickly into the "them" category, and ignored at best, or viewed
as pariahs at worst.
As I said, none of this is new insight. Plenty of people have made the
same observations countless times in the last few years (and I'd guess
if we looked we could find people saying similar things decades and
centuries ago too). The question is, what can we do about it? How can
we avoid falling into the same trap? Is it even possible? Is there
much to do besides "pick a side and hope it's the right one"?
One option is to look at the data itself, rather than trust sources. In
an ideal world, that's what we'd do. But it's time consuming, hard
work, and often we don't have the expertise to do the necessary
analysis. I'd do this more when I was at my old job (when I had more
time, and better software for crunching numbers), and still try to do it
whenever I can these days, but for the vast majority of things, I just
don't have time for it. I've got a scientific background, but only in a
fairly narrow field. I'm not an economist, a climate scientist, a
census taker, etc. I can do rudimentary checks, but beyond that I don't
consider myself a trusted source on many of these questions.
Another option is to make a conscious effort to read/listen to sources
outside our own echo chamber. The danger here, though, is the tribalism
effect. If we intentionally go looking for sources we expect to
disagree with, instead of opening our minds, it can actually reinforce
our views by confirming our expectations of "them". Finding sources you
sometimes disagree with can be useful, I think, as it keeps you
from expecting to agree or disagree up front, and forces you to evaluate
each point on a case by case basis. But it can be tough to find sources
that don't tend to fall into the "us" or "them" categories most of the
time.
What do you guys think? Is there much hope for objectivity, or are we
doomed by human nature to this kind of source picking? Does it help
open our minds to talk to those of different views, or does it just make
us stronger believers?
We all think we're right, and that we're open minded and have selected
the "correct" sources, and yet we tend to end up with different sources.
It's easy to say "well, those guys believe crazy stuff because they
listen to X, Y, and Z!" but it doesn't really convince them that their
"crazy" views are wrong (let alone crazy), and it doesn't help us come
to any kind of agreement.
Again, I'm not just talking about those things that we're probably
always going to have disagreement about: religion, the future,
morality, etc. I'm talking about objective reality stuff that we can't
seem to agree on. Is the planet getting warmer? Does radiometric
dating work? Are taxes lower today than they were under Reagan? Was
Obama born in Hawaii? Did Fannie and Freddie do most of the subprime
lending, or did private banks do more? Is the "heat index" a new term
introduced just recently by the government? Do most biologists consider
evolution to be a fact? All these kinds of questions and more divide us
into "us and them" positions, and evidence doesn't really seem to play
much part in the discussion, because the sides can't agree on what sources
to trust to provide the data in the first place. What can be done about
it? Is there any hope for objectivity?
People can, and do, change their minds about deeply held beliefs. But
in my experience, it's not usually new data that causes them to do so,
but rather a new source. It takes someone they can trust telling them
to consider a different position. Or, perhaps more to the point,
someone they trust telling them that this other position isn't
necessarily in opposition to all the other things they believe. For
example, you can accept X that "they" say is true, without actually
having to agree with them on Y and Z as well. But often that voice
ends up being discarded and untrusted when it calls on us to consider
the possibility that we're wrong. What makes the difference between the
two cases? Why do we listen in some cases, and abandon the source in
others?
This brings up another factor I didn't mention before: the "clumping" of
ideas. Because of the importance of sources, positions on unrelated
issues often get linked. There's no real reason why the view that
climate change is a hoax should be correlated with pro-life views, or
that opposition to replacing social security with investment accounts
should correlate with acceptance of evolution, but ideas like these do
tend to end up correlating. In part, I would argue, because a source
that doesn't fit neatly into the "us" or "them" category won't be as
influential in affecting our views. So once camps have defined
themselves, we end up with the somewhat illogical view that if we change
our mind about X, then all our other beliefs should suddenly be in doubt
as well, because if so-and-so was wrong about X, well, what else could
they have been wrong about? The flip side of this is that we start to
think "well, I don't know much about X, but I know so-and-so was right
about Y, so I'm sure he's right on this too!"
So, a bit of a ramble, I know, and not really ending with a clear
question. But what do you guys think?