I read a book last fall that was simultaneously enthralling and depressing: “The Myth of the Rational Voter” by Bryan Caplan, published in 2007. The central argument of the book is that voters are neither well informed nor rational, and not because they are bad people but because there are no incentives to be either informed or rational.
What do I mean when I say there are no incentives to be informed when it comes to voting? An incentive is something that motivates action. What motivates a person to vote? One plausible-sounding answer is that the person expects to gain from having one candidate win rather than another, and votes for that candidate in an effort to make that a reality. Voters will then gather information about the candidates to make sure they choose the one who benefits them.
This seems like a reasonable explanation for why people vote, but it doesn’t hold up under scrutiny. The fact is that the gains from voting are far too small to justify spending any time at all collecting information about the candidates. Imagine that you stand to gain $10,000 if Candidate A wins instead of Candidate B. In a national election with 100 million voters, your vote has a 1 in 10 million chance of affecting the outcome. What can you expect to gain by voting? Even though the payoff to you is high if Candidate A wins, the probability of your vote deciding the election is so low that your expected return to voting is $10,000 × 1/10,000,000, a meager one-tenth of one cent!
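If you want to play with the numbers yourself, here is a minimal sketch in Python; the $10,000 stake and the 1-in-10-million odds are the hypothetical figures from above, not empirical estimates:

```python
# Expected monetary return from casting a vote, using the
# hypothetical numbers from the example above.
stake = 10_000               # dollars gained if Candidate A wins
p_decisive = 1 / 10_000_000  # chance your vote decides the election

expected_return = stake * p_decisive
print(f"Expected return from voting: ${expected_return:.3f}")
# prints: Expected return from voting: $0.001
```

One-tenth of one cent. Even if you quibble with the inputs, the conclusion survives: the stake would have to be astronomically large, or the election implausibly close, before the expected return covered even a few minutes of research.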
Simple arithmetic demonstrates that voters have almost nothing to gain (and almost nothing to lose) by voting for one candidate over another, or by voting at all. If people care about maximizing their earnings, we would not expect them to spend copious amounts of time researching which candidate is best, because the answer to that question is worth almost nothing to them. In other words, voters and non-voters alike can afford to be ignorant of politics because there is no payoff to becoming educated.
The data on the public’s knowledge of government bear this out. Specifically, Caplan cites a 1995 nationwide poll by the Kaiser Family Foundation that tested Americans’ knowledge of the federal government’s budget. Respondents were given a list of six budget items (foreign aid, welfare, interest on the federal debt, defense, Social Security, and health) and asked to name the two largest. Forty-one percent of respondents named foreign aid as one of the top two programs, even though it received only about 1 percent of the budget. Only 14 percent correctly identified Social Security, which accounted for 21 percent of the budget, tying it with defense as the most expensive federal program in 1995.
People have no incentive to seek out accurate information about what the government does, so it’s not surprising that their beliefs stray so far from the facts. But perhaps this has something to do with the difficulty of finding reliable information. Surely, if the public had easy access to the unadulterated truth, their views would conform more closely to reality. Right?
A friend of mine from college named David Faden pointed me to an academic paper that sheds some light on this issue. The paper was written by political scientists Jason Reifler and Brendan Nyhan, who conducted an experiment in 2005 to test how easily a person’s misperceptions about political issues could be corrected.
Reifler and Nyhan had roughly 130 subjects read a 2004 speech by George W. Bush in which he claimed there was a risk that Saddam Hussein possessed weapons of mass destruction (WMD). Half of the participants then read a news article that mentioned the Iraq Survey Group’s finding that it had found no evidence Iraq possessed WMD in 2003; this was the group that received the “correction.” The other half did not read that information. The researchers then asked all subjects how strongly they agreed with the statement that Iraq possessed WMD in 2003. What Reifler and Nyhan found was that the correction reduced misperceptions, but only among self-identified liberals. Interestingly, conservatives who read the correction believed more strongly that Iraq had WMD than conservatives who did not read it.
In a later study, Reifler and Nyhan tested subjects’ views on whether George W. Bush had banned stem cell research. Participants read a 2004 news story in which John Kerry accused Bush of having banned stem cell research. Once again, half of the participants received a version of the story with an additional paragraph, this time noting that Bush’s policy did not limit privately funded stem cell research. Just as in the first experiment, the correction worked on one ideological group but not the other: conservatives who read the correction were less likely to believe Bush had banned stem cell research than conservatives who did not, while the correction had no effect on liberals.
After reading this study, I was even more disillusioned about the prospect of a functioning democratic government than I was after reading Caplan’s book. It’s bad enough to think that people have wildly mistaken beliefs about politics. But to know that people won’t change their minds even after their beliefs are shown to be false? I think that means we need to start looking for better ways of making policy than voting. I plan to expound on some of those ways in future posts.