Because of their polling, Mitt Romney, his family and his campaign staff were confident of victory, then stunned and shell-shocked on election night as the results rolled in and they lost to Barack Obama.
This link explains why they were so wrong. What they really needed was this Prof’s Primer on Polling, especially this editor’s checklist for evaluating polling, or any research.
Americans are justifiably suspicious of polling, and not just for political reasons. Some of the fault lies with newspapers and broadcasters, which have reported off-beat “research” to snag an audience. Yesterday I referred to USA Today once carrying a Women’s Day poll reporting that its readers wouldn’t marry the same man again; that’s just one example. The poll was loaded with faults, and the media failed to point them out. Fox then picked up the myth in September and ran with it. Karl Rove’s election-night meltdown on Fox over the call of Ohio for Obama is typical of the consequences of that ignorance. Be suspicious, but understand the facts.
So here is this editor’s checklist for evaluating polls. I think it should be every American’s checklist too.
- Who sponsored the polling? Is there a conflict of interest? Political polling is especially suspect if it’s loaded with distrust for the other side. Beware any polling done by a PR firm for a client. I understand the Republicans with their distrust of the so-called “liberal media” wanting to conduct their own polls. But they fell victim to the same faults they suspected from the other side.
- Who was included in the polling? If you poll only certain age groups or geographic areas, or don’t reach likely voters, your results will not be valid. See the next question.
- How were the people chosen? Was it a true “random” sample? If not, the results will be skewed (as with Romney’s polling).
- How many people were in the sample? If the poll has fewer than 1,100 respondents, your margin of error is going to be more than three percent.
- What was the response rate? If your sample was 1,000 and only 400 answered, you get results like the Women’s Day article, which threw out more responses than it counted. Don’t seriously consider any poll with a response rate below 60 percent.
- How accurate are the results? Always compute the margin of error; if the results fall within it, then “it’s too close to call.”
- Who were the interviewers? Were they professionally trained and neutral? If not, prejudice, inflection of the voice and other factors can affect results.
- How was the polling conducted? Robo-call? In person? If it was a call-in or send-in poll, the results are worthless because they’re not random.
- When was the poll conducted? Results can change overnight.
- What were the actual questions asked? Wording can influence results, and can lead to opposite results on the same matter, depending on wording.
- Are the results cause and effect or just correlation? I saw a headlined story once: “Want to live a longer life? Marry yourself a younger wife.” The poll found that men with younger wives lived longer. But this might not be cause and effect, because many other factors are involved: health, wealth and so on.
- Does the headline match the polling results?
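The sample-size, response-rate and margin-of-error questions above come down to simple arithmetic. Here is a minimal sketch in Python, assuming the standard margin-of-error formula for a simple random sample at 95 percent confidence (the function names are my own illustration, not from any polling source):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error at 95% confidence for a simple random sample of size n.
    Uses p = 0.5, the worst case, unless told otherwise."""
    return z * math.sqrt(p * (1 - p) / n)

def response_rate(responded, sampled):
    """Fraction of the sample that actually answered."""
    return responded / sampled

# Roughly 1,100 respondents keeps the margin of error near three percent:
print(round(margin_of_error(1100) * 100, 1))   # about 3.0

# 400 answers from a sample of 1,000 is a 40 percent response rate,
# well under the 60 percent minimum suggested in the checklist:
print(round(response_rate(400, 1000) * 100))   # 40
```

Note that the margin of error shrinks only with the square root of the sample size, which is why going far beyond 1,100 respondents buys little extra accuracy.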
If you’re an American and can’t answer these questions satisfactorily, you shouldn’t believe the results. If your newspaper, broadcaster or online source doesn’t answer the questions for you, you should be very skeptical.
A few of the faults of the Women’s Day poll and article: 1. It wasn’t random for all women in America, only subscribers. 2. Women’s Day had more than one million subscribers, and about 100,000 responded, a response rate under 10 percent. 3. The responses were clipped out of the magazine and mailed in (this was pre-Internet). 4. Once the magazine got the results, some of them were not counted.
So, a responsible newspaper or news outlet reporting polling results should include an explanatory item with every story. It should be written along these lines:
“The Daily Geezer poll of 1,200 registered voters in Geezer County was conducted Oct. 31, 2012, and asked two questions: ‘Will you vote Nov. 6?’ and ‘Which presidential candidate will you vote for?’ The results have a margin of error of plus and minus three percent."
A final note: Newspapers, broadcasters and other news sources have an obligation to explain this to you. If they don't, they're not being responsible. If they report frivolous poll results to get your attention, it should tell you they're a frivolous news source.