The Closing of the Political Mind
Ezra Klein assesses the damage on Vox.com.
If you want to feel depressed about the future of American politics, consider Obamacare: it confirms an unnerving phenomenon that has been well documented by social scientists — more and better information has almost no effect on the political mind.
There’s a simple theory underlying much of American politics. It sits hopefully at the base of almost every speech, every op-ed, every article, and every panel discussion. It’s taught in civics classes and laced through TED talks. It’s what we might call the More Information Hypothesis: the belief that our most bitter political battles are mere misunderstandings.
The cause of these misunderstandings? Too little information — be it about climate change, taxes, Iraq, or Obamacare. If only the citizenry were more informed, the thinking goes, then there wouldn’t be all this fighting. Social scientists have tested this hypothesis again and again. It turns out to be not just wrong but literally backward. The more information partisans get, the deeper their disagreements become. When it comes to politics, people reason backward from their conclusions.
Yale law professor Dan Kahan has built something of a cottage industry out of these experiments. In one study, he tested people’s scientific literacy alongside their ideology and then asked about the risks posed by climate change. If the problem were truly that people needed to know more science to appreciate the dangers of a warming climate, then their concern should have risen alongside their knowledge. But the opposite was true: among people already skeptical of climate change, greater scientific literacy made them more skeptical still.
Politics, it turns out, makes smart people stupid.
Obamacare has been a real-time test of these theories. Much has happened to the law over the past five years. But opinions have barely budged. It’s easy to poll voters and lament how little they know, or have learned, about the law. But in Washington, where the information is both plentiful and regularly used, the hardening of opinions is, if anything, even worse. No congressional Democrats have watched Obamacare’s progression and turned against the law. No congressional Republicans have noticed the law covering tens of millions of people with cheaper-than-expected premiums and decided maybe it’s not such a disaster after all.
I recently had a sadly confirmatory experience in this vein. I am a big supporter of my college’s athletic teams and participate avidly in an online community of similarly enthusiastic fans. The site includes a politics board, where I occasionally post to try out new ideas or sharpen my debate skills. There I got into a discussion about LGBT folks as parents with a fellow whose sports takes I have long admired for their perceptiveness and lucidity. I think he either is or has been a coach. I knew he was on the other side politically, but given his obvious intelligence I hoped we could at least have a productive dialogue.
I must plead guilty to the TED talk assumption: if I could marshal facts and carefully reasoned arguments, that would carry the day. Those facts and reasoned arguments proved to be gasoline on the fire. Further, his arguments betrayed some very basic, fundamental errors in interpreting and making meaning of scientific evidence. While I feel sure he knows more about football than I do, I feel just as certain that I know far more about research science. It became clear that he didn’t know what he didn’t know. As I gently began to expose and explain some of the beginner’s flaws in his arguments, he only escalated them, with growing vehemence, declining coherence, and hardening (if misplaced) certitude. Discouraged, I finally let the matter drop.
Cross-posted from The Sensible Center