Writing a poll isn’t as easy as it sounds. One of the biggest challenges in dealing with respondents is that they can lie, or they can tell you what they think you want to hear. That problem is particularly important when building a likely voter model. So what do you do when a poll’s entire purpose is to determine who will turn out to vote? You have to really dig down and ask the best questions possible.
Yesterday, SCI released a poll saying that “nine in ten sportsmen and women are ‘very’ likely to vote in the upcoming mid-term elections.” My first question was how they determined a likely voter. When I finally saw the question, I was a little skeptical. I wasn’t eager enough to go downstairs and dig out the textbooks from my college polling class just to raise questions, but this morning a relevant post happened to cross my path courtesy of Jim Geraghty. And you know how I am about stirring the pot.
The first two questions in SCI’s poll ask whether the respondent is registered to vote and how they are registered. It’s the third question they appear to use to determine a likely voter: “And how likely is it that you will vote in the upcoming November election for Congress?” The strongest answer, “very likely,” garnered 88% of responses, and “somewhat likely” added another 10%. That means 98% are “likely” voters by their measure. A number that high simply isn’t believable. Geraghty’s link today pointed out that defining likely voters with this style of questioning is very unreliable in a year like this:
The most difficult job a pollster has is trying to figure out who the actual voters are going to be in a given election year. This is easier said than done, because we know that (a) almost all survey participants say they will vote in the midterm election and (b) historically, only about 40 percent will.
Pollsters do their best to solve this problem by screening out those who are unlikely to vote using a question or series of questions probing interest in the election and/or prior voting behavior. These techniques vary widely from pollster to pollster. Some pollsters use especially “loose” voter screens: asking only, for example, if someone is certain to vote, without probing any deeper.
For example, simply asking respondents if they are certain to vote (used by Suffolk) will sometimes let more than 90 percent of respondents through a screen. In such a situation, nearly half of the respondents who are counted will not actually vote.
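The arithmetic behind that “nearly half” claim is worth spelling out. As a rough sketch, assume (as the quoted passage suggests) that about 40 percent of all respondents actually vote, and further assume that essentially every actual voter passes the loose screen; that second assumption is mine, not the article’s:

```python
# Back-of-the-envelope check on the "nearly half" claim above.
# Assumptions: 40% of all respondents actually vote (from the quoted
# passage), and essentially every actual voter passes the loose screen
# (my assumption, to make the bound concrete).

passed_screen = 90   # percent of respondents let through a loose screen
actual_voters = 40   # percent of all respondents who historically vote

# Non-voters counted by the poll, as a share of everyone counted.
non_voters_counted = passed_screen - actual_voters   # 50 points
share = non_voters_counted / passed_screen           # 50 / 90

print(f"{share:.0%} of counted respondents won't actually vote")
```

Under those assumptions, roughly half (about 56%) of the people the poll counts as likely voters will stay home.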
The article does note that even with tighter screens, some people who won’t actually vote will still get through. No poll is perfect, but I do believe it’s worth at least trying to weed out some of those non-participants to get a more accurate picture.
To SCI’s credit, their pollster did try to measure enthusiasm. It was very high, but then again, the survey responses were tilted pretty heavily toward Republicans, which would likely reflect the higher-than-normal interest in these elections. Still, their enthusiasm measure should be a sign that the 98% number is way off. Respondents were asked to rate their interest in the elections on a scale of 1 to 10, and 23% rated their interest as 5 or less. Interest is almost certainly a worthy factor in whether someone is likely to vote, and accounting for it brings us down to less than 80% of potential likely voters. Many polls also ask whether the person has a history of voting in recent elections, which is usually a decent indicator of future behavior. Unfortunately, the SCI poll didn’t go into this background with the folks they called. The more questions you ask along these lines, the more liars you weed out.
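Here’s a minimal sketch of that screening arithmetic. The pass rates come from the poll as described above; how the two questions overlap is an assumption on my part, since SCI published no cross-tabs:

```python
# Likely-voter share after layering a second screening question onto the
# SCI numbers. Percentages are from the poll as described in the post;
# the overlap between the two questions is assumed, not reported.

likely = 88 + 10           # "very" + "somewhat" likely to vote -> 98%
low_interest = 23          # rated interest at 5 or less on the 1-10 scale
not_likely = 100 - likely  # the 2 points that failed the first screen

# Low-interest respondents who passed the likelihood screen get screened
# out too. Best case: both "not likely" points were also low-interest.
best_case = likely - (low_interest - not_likely)   # 98 - 21 = 77
# Worst case: every low-interest respondent claimed to be likely to vote.
worst_case = likely - low_interest                 # 98 - 23 = 75

print(f"Likely voters after both screens: {worst_case}% to {best_case}%")
```

Either way the estimate lands in the mid-to-high 70s, comfortably under the 80% mark and a long way from the headline 98%.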
Before anyone says I’m just getting nitpicky, I think it’s important to consider why we need to go the extra mile to get the right information. Is a publicly released poll touting that 9 out of 10 sportsmen vote more valuable than one kept internally showing that only 7 in 10 will likely vote? If all you’re after is a quick headline for the movement, a quick dose of patriotism, and maybe some numbers to casually throw in front of a politician, then it probably is better to forgo the expense of adding the extra questions that would determine your true likely voters. However, if you want the poll to drive turnout machines, move resources in the right direction, or formulate a plan to engage more people, it’s better to have the most accurate information. Personally, I’m more interested in results, so I’ll go with the latter option. It still shows that sportsmen vote at higher rates than the average voter, so it does us no harm. It also may show us how we can improve our outreach so the 9-in-10 statistic is actually reflected on Election Day.