What is at the root of denial? A Must Read from Chris Mooney in Mother Jones

Chris Mooney has been exploring the basic underpinnings of denialism lately, with this latest article a good summary of the basic problems:

In a recent study of climate blog readers, Lewandowsky and his colleagues found that the strongest predictor of being a climate change denier is having a libertarian, free market world view. Or as Lewandowsky put it in our interview, “the overwhelming factor that determined whether or not people rejected climate science is their worldview or their ideology.” This naturally lends support to the “motivated reasoning” theory—a conservative view about the efficiency of markets impels rejection of climate science because if climate science were true, markets would very clearly have failed in a very important instance.

But separately, the same study also found a second factor that was a weaker, but still real, predictor of climate change denial—and also of the denial of other scientific findings such as the proven link between HIV and AIDS. And that factor was conspiracy theorizing. Thus, people who think, say, that the Moon landings were staged by Hollywood, or that Lee Harvey Oswald had help, are also more likely to be climate deniers and HIV-AIDS deniers.

This is similar to what we’ve been saying for years. Ideology is at the heart of antiscience (yes, even liberal ideology), and when it conflicts with science it renders the ideologue incapable of rationally evaluating the facts. The more extreme the ideology, the more likely and more severe the divergence from science. Then there is the separate issue of cranks, who have a generalized defect in their reasoning abilities, are generally incompetent at recognizing bad ideas, often believe conflicting theories simultaneously, and are given to supporting any other crank who they feel is showing that science is somehow fundamentally wrong. This is the “paranoid style”; it is well described and likely irreversible. More run-of-the-mill denialism, however, should be preventable.

We’ve discussed this extensively in regard to research by Dan Kahan, although I have disagreed with the jargon of “motivated reasoning.” Chris, however, knows what they’re referring to with their fancified science-speak: ideology is at the root of denial.

It is of paramount concern to recognize that the problem of anti-science is not one of a lack of information, or of education, or of framing. This is a problem with humans. This is the way we think by default. People tend to arrive at their beliefs based on things like their upbringing, their religion, their politics, and other unreliable sources. When opinions are formed based on these deeply held beliefs or heuristics, all information subsequently encountered is either used to reinforce the belief or is ignored. This is why studies show that education doesn’t work: the more educated a partisan is on a topic, the more entrenched they become. You can’t inform or argue your way out of this problem; you have to fundamentally change the way people reason before they form these fixed beliefs.

Scientific reasoning and pragmatism are fundamentally unnatural and extremely difficult. Even scientists, when engaged in a particularly nasty internal ideological conflict, have been known to deny the science. This is because when one’s ideology is challenged by the facts, the result is an existential crisis. The facts become an assault on the person themselves, their deepest beliefs, and how they perceive and understand the world. What is done in this situation? Does the typical individual suck it up and change, fundamentally, who they are as a person? Of course not! They invent a conspiracy theory as to why the facts have to be wrong. They cherry-pick the evidence that supports them, believe any fake expert who espouses the same nonsense, and demand more and more evidence, never accepting that their core beliefs might be wrong. This is where “motivated reasoning” comes from. It is a defense of the self against the onslaught of uncomfortable facts. Think of the creationist confronted with the fossil record, molecular biology, geology, physics, and half a dozen other scientific fields: are they ever convinced? No, because it’s all an atheist conspiracy to make them lose their religion.

How do we solve this problem?

First, we have to recognize it for what it is, as Mooney and others have done here. The problem is one of human nature. Engaging in denialism doesn’t have to mean you’re a bad person, or even that you’re being purposefully deceptive (although there are those with that trait); the comparison to Holocaust denial, always a favorite straw man of the denialist, is not apt. Denialism in most people is a defense mechanism that protects their core values from being undermined by reality. And no matter where you are on the ideological spectrum, at some point you will come into conflict with the facts, because no ideology perfectly describes or models all of reality. The question is, what will you do when that conflict arises? Will you entrench behind a barrier of rhetoric, or will you accept that all of us are flawed, and that our beliefs at best provide an approximation of reality – a handy guide but never an infallible one?

Second, we have to develop strategies for preventing ideological reactions to science, and for recognizing when people are reacting irrationally to an ideological conflict with science. One of my commenters pointed me to this paper, which describes an effective method of inoculating people against conspiratorial thinking. Basically, if you warn people ahead of time about conspiratorial craziness, they will evaluate the claims of conspiracists with greater skepticism. We should encourage skeptical thinking from an early age, and specifically educate against conspiratorial thinking, which is a defective mode of thought designed to convince others to act irrationally (and often hatefully). When we do see conspiracy theorizing, we shouldn’t dismiss it as harmless: the claims need to be debunked, and the purveyors of conspiracy theories opposed and mocked.

Before anyone ever reads a line of Alex Jones or Mike Adams, training in skepticism could provide protection, and with time the paranoid style will hold less and less sway. People primed to expect conspiratorial arguments will be resistant to them, and more skeptical in general. The Joneses, Moranos, and Adamses of the world don’t have the answers; they know nothing, and their mode of thought isn’t just wrong but actively poisonous to rational thought. As skeptical writers, we should educate people in a way that protects them from their inevitable encounter with such crankery. This is why writers like Carl Sagan, with his (albeit incomplete) Baloney Detection Kit, are so important. He knew that you have to prepare people for their encounters with those who have an ideological agenda, who will bend the truth and deny the science for selfish reasons.

This is what is at the heart of true skepticism. First, understanding that you can be wrong – in fact, you will often be wrong – and that all you can do is follow the best evidence you have. It’s not about rejecting all evidence, or the inaction born of the constantly moved goalposts of the fake skeptics. It’s about pragmatism, thoughtfulness, and above all humility about the fact that none of us has all the answers. Second, it’s understanding that not all evidence is created equal. Judging evidence and arguments requires training and preparation, because recognizing high-quality evidence and rational argument is not easy. In fact, most people are woefully underprepared by their education to do things like read and evaluate scientific papers, or even just to judge scientific claims in media sources.

Thus I propose a new tactic. Let’s get Carl Sagan’s Baloney Detection Kit into every child’s hands by the time they’re ten. Hell, it should be part of the elementary school curriculum. Let’s hand out books on skepticism like the Gideons hand out Bibles. Let’s inoculate people against the bullshit they’ll invariably contract by the time they’re adults. We can even run tests to see which type of skeptical inoculation works best at protecting people from anti-science. It’s a way forward to make progress against the paranoid style and the nonsense beliefs purveyed by all ideological extremes. There is no simple cure, but we can inoculate the young, and maybe control the spread of the existing disease.

Tribalism, Cultural Cognition, Ideology, we're all talking about the same thing here

From Revkin I see yet another attempt to misunderstand the problem of communicating science vs anti-science.

The author, Dan Kahan, summarizes his explanation for the science communication problem, as well as four other “not so good” explanations, in this slide:

[Image: Kahan slide]

He then describes “Identity-protective cognition” thus:

Identity-protective cognition (a species of motivated reasoning) reflects the tendency of individuals to form perceptions of fact that promote their connection to, and standing in, important groups.

There are lots of instances of this. Consider sports fans who genuinely see contentious officiating calls as correct or incorrect depending on whether those calls go for or against their favorite team.

The cultural cognition thesis posits that many contested issues of risk—from climate change to nuclear power, from gun control to the HPV vaccine—involve this same dynamic. The “teams,” in this setting, are the groups that subscribe to one or another of the cultural worldviews associated with “hierarchy-egalitarianism” and “individualism-communitarianism.”

CCP has performed many studies to test this hypothesis. In one, we examined perceptions of scientific consensus. Like fans who see the disputed calls of a referee as correct depending on whether they favor their team or its opponent, the subjects in our study perceived scientists as credible experts depending on whether the scientists’ conclusions supported the position favored by members of the subjects’ cultural group or the one favored by the members of a rival one on climate change, nuclear power, and gun control.


Does anyone else think that maybe they’re unnecessarily complicating this? First, denialism is not an explanation for the science communication problem. It is a description of tactics used by those promoting bogus theories. Denialism is the symptom, ideology is the cause, and what we consider ideology seems more or less synonymous with this “identity-protective cognition”, while being less of a mouthful.

Call it what you will: when you have an ideology, religion, politics, or other deeply held beliefs that define your existence and your concept of “truth,” conflicts with this central belief are not just upsetting; they create an existential crisis. When science conflicts with your ideology, it conflicts with who you are as a person, how you believe you should live your life, and what you’ve been raised to believe. And almost no matter what ideology you subscribe to, eventually science will come into conflict with it, because no ideology, religion, or political philosophy is perfect. Eventually, they all jar with reality. And what do most people do when science creates such a conflict? Do they change who they are, fundamentally, as a person? Of course not. They just deny the science.

Denialism is the symptom of these conflicts, and this is where the problem with the term “anti-science” comes in. Most denialists and pseudoscientists aren’t against science, as the term suggests. I think of “anti-science” as meaning in conflict with established, verifiable science, without good cause. But most people read it as being against science as some kind of belief system or philosophy, which it usually isn’t. And while some people do promote the “other ways of knowing” nonsense, for the most part, even among denialists, there is acceptance that the scientific method (which is all science is) is superior at determining what is real versus what is not.

That is why they are pseudoscientists. They try to make their arguments sound scientifically valid by cherry-picking papers from the literature, by using science jargon (even if they don’t understand it), or by pointing to fake experts whom they think confer additional scientific strength on their arguments. They crave the validity that science confers on facts, and everyone craves scientific validation of (or at least consistency with) their ideology or religious beliefs. It sucks when science conflicts with whatever nonsense you believe in, because science is just so damn good at figuring stuff out, not to mention providing you with neat things like longer life expectancy, sterile surgery, computers, cell phones, satellites, and effective and fun pharmaceuticals. This is why (most) pseudoscientists and denialists insist that the science is really on their side, not that science isn’t real or that it doesn’t work. We know it works; the evidence is all around us – you are, after all, using a computer to read this. “Anti-science” as a term is too frequently misunderstood, or simply inaccurate.

Pseudoscientists and denialists don’t hate science; hatred is not what makes them anti-science. They crave the validity that science confers, and want it to apply to their nonsense as well. Sadly, for about 99.9% of us, at some point science will conflict with something we really, really want to be true. What I hope to accomplish with this blog is to show what it looks like when people are so tested, and fail. And I suspect the majority of people do fail, because in my experience almost everyone has at least one cranky belief or bizarre political theory. Hopefully, once people learn to recognize denialist arguments as fundamentally inferior, they will be less likely to accept them, and when it’s their turn to be tested, they will do better.

Scientific American addresses denialism in politics – says it jeopardizes democracy

Scientific American evaluates the candidates’ answers to Sciencedebate 2012 and assesses ideology-based denialism as a whole:

Today’s denial of inconvenient science comes from partisans on both ends of the political spectrum. Science denialism among Democrats tends to be motivated by unsupported suspicions of hidden dangers to health and the environment. Common examples include the belief that cell phones cause brain cancer (high school physics shows why this is impossible) or that vaccines cause autism (science has shown no link whatsoever). Republican science denialism tends to be motivated by antiregulatory fervor and fundamentalist concerns over control of the reproductive cycle. Examples are the conviction that global warming is a hoax (billions of measurements show it is a fact) or that we should “teach the controversy” to schoolchildren over whether life on the planet was shaped by evolution over millions of years or an intelligent designer over thousands of years (scientists agree evolution is real). Of these two forms of science denialism, the Republican version is more dangerous because the party has taken to attacking the validity of science itself as a basis for public policy when science disagrees with its ideology.

I agree. We’ve debated on this site the prevalence of denialism on the left versus the right, but I think that’s a distraction from the central point, which is being argued most effectively by Jonathan Haidt: humans are not rational beings, and most uses of reason are to rationalize positions we arrived at by intuitive means. That means any ideology is going to strain your relationship with science. Humans tend to hold positions based on shortcuts, or heuristics, that lead them to what feels right, and then they use reason to dig into those positions. It is extremely difficult, and uncommon, for people to change their minds based on reason and evidence. So any time political ideology is the source of people’s positions, you will encounter anti-science when those ideologies conflict with the science. Just as right-wingers have a big problem with climate change and evolution, left-wingers have a big problem with a kind of food religion, GMO and toxin paranoia, and other health and environmental denialism. I think the author here, Shawn Otto, has it exactly right.

His argument tying the problem to encroaching authoritarianism may be more of a stretch:

By falsely equating knowledge with opinion, postmodernists and antiscience conservatives alike collapse our thinking back to a pre-Enlightenment era, leaving no common basis for public policy. Public discourse is reduced to endless warring opinions, none seen as more valid than another. Policy is determined by the loudest voices, reducing us to a world in which might makes right—the classic definition of authoritarianism.

I don’t know that authoritarianism is the destiny of a population that rejects science, though such a population is surely at greater risk of manipulation by those who control the message most effectively. More likely, we would be easily manipulated into supporting an oligarchy or plutocracy of those at the top of society who can manage media and politicians through money and influence – or, at worst, we might get a kakistocracy if the likes of the Tea Party come to power. Otto is right, however, that when empiricism and facts no longer matter, the likelihood increases that the unqualified, the unprincipled, and the ignorant will come to power.