Using Polls to Mislead the Public

10-16-08, 9:24 am

In the fifth chapter of his book, The Opinion Makers, David W. Moore explains why so many polls contradict each other and generally misrepresent what people actually think about major policy issues. One problem is that many policy issues are both arcane and complex, and many, if not most, people are not following them and simply don't know what to think. This is a fact, Moore says, 'that media pollsters generally do everything in their power to conceal. Rather than allow respondents to freely acknowledge they don't have an opinion, pollsters pressure them to choose one of the available options.'

One of the tricks of the trade in polling is that vastly different results can be obtained depending on how the questions are designed. This is especially the case, Moore points out, when people are not well informed about the issue and forced-choice questions are presented to them. An example is the polling done by Frank Luntz, a Republican pollster working for Arctic Power, a group that favors drilling in the Arctic National Wildlife Refuge. He reported that drilling was favored 51 percent to 34 percent. This was a month after a poll by John Zogby for the Wilderness Society, an anti-drilling group, reported that drilling was opposed 55 percent to 38 percent. Zogby presented the issue as an environmental one, whereas Luntz presented it as one of energy independence.

Another example: in 2003 an ABC/Washington Post poll found that Americans opposed sending US troops to Liberia 51 percent to 41 percent, while a CNN/USA Today/Gallup poll found that they approved sending the troops 57 percent to 36 percent. Moore quotes the editor-in-chief at Gallup as saying: 'Opinions about sending in US troops are therefore very dependent on how the case is made in the questions and what elements of the situation there are stressed to them [the respondents].' The two polls essentially manufactured their respective results, 'neither of which,' Moore concludes, 'told the truth about how little people knew and how unengaged they were from the issue.'

The next example concerns the State Children's Health Insurance Program (SCHIP), whereby the federal government helps the states pay for children's health insurance for families not poor enough to qualify for Medicaid yet not well-off enough to buy their own. Polls were taken in 2007 to see if the public supported this program. CNN said 61 percent supported SCHIP, CBS found 81 percent, and ABC/WP found 72 percent, but Gallup found 52 percent OPPOSED. Why this great disparity? It was 'because each poll fed its respondents selected information, which the general public did not have. The real public, where only half of Americans knew anything about the program, wasn't represented in these polls at all.'

One last poll. In 2006 Americans were asked, 'Should it be more difficult to obtain an abortion in this country?' Pew asked twice and got two diverging answers. On the first survey, 66 percent of respondents said YES, abortions should be harder to get, while the second Pew survey got only 37 percent YES; Harris got 40 percent YES, and CBS/NYT got 60 percent YES. It turns out that most Americans are not really informed about this issue, and, Moore says, 'forcing them to come up with an opinion in an interview meant that different polls ended up with contradictory results.'

So what can we conclude regarding media polls that claim to tell us what the American people think? Moore writes: 'By manipulating Americans who are ill informed or unengaged in policy matters into giving pseudo opinions, pollsters create an illusory public opinion that is hardly a reflection of reality.' In other words, most opinion polls on public policy are junk.

--Thomas Riggins is the associate editor of Political Affairs.