10-28-08, 10:42 am
Public opinion polls are deliberately designed to NOT reveal what the American people are really thinking, writes David W. Moore in his recent book The Opinion Makers.
In this book, Moore points out that people sometimes hold superficial opinions on some matters while viewing other problems and issues thoughtfully and deeply. Pollsters go out of their way to smooth over this complex reality because media clients want clear-cut expressions of opinion.
People are also often unclear about the complexities of the issues they are asked about, so pollsters fill them in to get a definite answer. The problem, however, is that this makes the respondents no longer a representative sample. Sometimes pollsters do ask if people have heard about the issue; other times they don't, depending on the issue and the kind of responses they want. 'That,' Moore says, 'is a deliberately manipulative tactic that cannot help but undercut pollsters' claims of scientific objectivity.'
When asking for opinions, pollsters should always include a question that asks whether the respondent knows or cares about the issue. Understanding the level of the public's understanding or knowledge of an issue is just as important as knowing what it thinks, and 'suppressing it for commercial or other purposes is simply unacceptable.'
Moore also says a question should be asked about the 'intensity' of the opinion. Pollsters should likewise stop supplying information to respondents, since that makes the poll 'hypothetical' rather than an actual reflection of what people are thinking.
The following rule should therefore be applied: any poll that does not reveal that at least 20 percent of the respondents are 'disengaged' has probably been manipulated and 'should be viewed with deep suspicion.'
Another thing to be wary of, according to Moore, is a device called the 'national electorate.' During primary season most polls take a nationwide survey and try to predict the primaries on that basis, which is why they are so often off course. It is too expensive to poll state by state, so the cheaper, and less accurate, 'national electorate' is polled instead. If the device can't be gotten rid of, then at the very least, after asking 'If the election were held today, who would you vote for?' pollsters should add a question about the degree of support for the respondent's choice – i.e., definitely would vote for, leaning toward but might change, have not really decided, etc.
In a section of the book called 'Fuzzy Opinion' we learn that wording can determine the outcome of a poll. For example, if you ask a question about the government's wanting to ban some action and use the term 'not allow' instead of 'forbid,' more people will say they agree with the government. More people will agree with programs labeled 'assistance to the poor' than if the term 'welfare' is used. More people will support 'gay and lesbian relations' than 'homosexual relations.' So pollsters know how to get the results they want once they figure out which buzzwords to use or to avoid.
Even the order of the questions can make a poll fuzzy. Given a choice between two answers, most people choose the second over the first. Order matters with multiple questions as well: Moore gives the example of Bill Clinton getting a better rating when he was listed after Al Gore rather than before.
Moore concludes that 'any measure of public opinion is at best a rough approximation of what people are thinking.' The margin of error is only one of many ways a poll can mislead. He ends his book by saying that polls could better reflect reality if they would only honestly try to measure the 'extent of public disengagement' rather than publish 'false results to conceal public ignorance and apathy.' However, there is no evidence that any of the major media polls are willing to do this. He hopes that their many contradictions will eventually shame them into being more honest with the public. As of now, they are doing a disservice to the democratic process.
--Thomas Riggins is associate editor of Political Affairs.