Asking the Right Questions

Writing a poll isn’t as easy as it sounds. One of the biggest problems in dealing with respondents is that they can lie, or they can tell you what they think you want to hear. That problem is particularly acute when you’re trying to build a likely voter model. So, what do you do with a poll whose entire purpose is to determine who will turn out to vote? You have to really dig down and ask the best questions possible.

Yesterday, SCI released a poll saying that “nine in ten sportsmen and women are ‘very’ likely to vote in the upcoming mid-term elections.” My first question was how they determined a likely voter. When I finally saw the question, I was a little skeptical. I wasn’t skeptical enough to go downstairs and dig out the textbooks from my polling class in college, but this morning a relevant post happened to cross my path courtesy of Jim Geraghty. And you know how I am about stirring the pot.

The first two questions in SCI’s poll ask whether the respondent is registered to vote and how they are registered. It’s the third question they appear to use to determine a likely voter: “And how likely is it that you will vote in the upcoming November election for Congress?” The top answer – “very likely” – garnered 88% of responses, with “somewhat likely” accounting for another 10%. That means 98% are “likely” voters by their measure. Any time a number is that high, it simply isn’t believable. Geraghty’s link today pointed out that defining likely voters with this style of question is very unreliable in a year like this:

The most difficult job a pollster has is trying to figure out who the actual voters are going to be in a given election year. This is easier said than done, because we know that (a) almost all survey participants say they will vote in the midterm election and (b) historically, only about 40 percent will.

Pollsters do their best to solve this problem by screening out those who are unlikely to vote using a question or series of questions probing interest in the election and/or prior voting behavior. These techniques vary widely from pollster to pollster. Some pollsters use especially “loose” voter screens: asking only, for example, if someone is certain to vote, without probing any deeper.

For example, simply asking respondents if they are certain to vote (used by Suffolk) will sometimes let more than 90 percent of respondents through a screen. In such a situation, nearly half of the respondents who are counted will not actually vote.
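To make the excerpt’s arithmetic concrete, here’s a minimal sketch in Python using the numbers quoted above, assuming the charitable case where every actual voter makes it through the screen:

```python
# Sketch of the loose-screen arithmetic described above, assuming the
# best case: all of the actual voters pass the screen.
screen_pass_rate = 0.90  # share of respondents a loose screen lets through
actual_turnout = 0.40    # share who will really vote (historical midterm rate)

# Respondents counted as "likely voters" who won't actually show up:
phantom_share = screen_pass_rate - actual_turnout
print(f"{phantom_share:.0%} of the full sample is counted but won't vote")

# Measured against the screened-in pool itself, it's even worse:
print(f"{phantom_share / screen_pass_rate:.0%} of the 'likely voter' pool won't vote")
```

And that’s the charitable case; if some real voters fail the screen, the pool counted as likely voters is polluted even further.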

The article does note that even when you use tighter screens, you’ll still get some people through who won’t actually vote. No poll is perfect, but I do believe it’s worth it to at least try to weed out some of the folks who don’t participate, just to get a more accurate picture.

To SCI’s credit, their pollster did try to measure enthusiasm. It was very high, but then again, the survey sample tilted pretty heavily toward Republicans, which likely reflects the higher-than-normal interest in these elections. Still, their own enthusiasm measure should be a sign that the 98% number is way off. Respondents were asked to rate their interest in the elections on a scale of 1 to 10, and 23% rated their interest at 5 or less. Interest is almost certainly a worthy measure of whether someone is likely to vote, and excluding that 23% brings us down to at most 77% potential likely voters – well under the 98% headline. Many polls also ask whether the person has a history of voting in recent elections, which is usually a pretty decent indicator of future behavior. Unfortunately, the SCI poll didn’t go into this background with the folks they called. The more questions you ask along these lines, the more liars you weed out.
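Each of those extra questions gives the screen another chance to catch a contradiction. As a rough illustration, here is what a composite screen might look like in code; the question set, field names, and cutoffs below are all hypothetical, not SCI’s or any pollster’s actual methodology:

```python
# Hypothetical composite likely-voter screen: stated intent alone lets
# nearly everyone through, so require interest and past behavior to
# back it up. Thresholds are illustrative, not any pollster's cutoffs.

def is_likely_voter(respondent: dict) -> bool:
    stated = respondent["stated_likelihood"]    # "very", "somewhat", "not"
    interest = respondent["interest_1_to_10"]   # self-rated interest, 1-10
    history = respondent["voted_last_midterm"]  # True/False

    if stated != "very":
        return False   # a loose screen would count "somewhat" here too
    if interest <= 5:
        return False   # low interest contradicts a "very likely" answer
    return history     # past behavior is the strongest single check

sample = [
    {"stated_likelihood": "very", "interest_1_to_10": 9, "voted_last_midterm": True},
    {"stated_likelihood": "very", "interest_1_to_10": 4, "voted_last_midterm": True},
    {"stated_likelihood": "somewhat", "interest_1_to_10": 8, "voted_last_midterm": False},
]

passed = sum(is_likely_voter(r) for r in sample)
print(f"{passed} of {len(sample)} respondents pass the tighter screen")  # 1 of 3
```

A screen like SCI’s would have counted all three of those respondents as likely voters; each added question knocks out another self-reported “likely” voter whose other answers don’t back up the claim.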

Before anyone says I’m just getting nitpicky, I think it’s important to consider why we need to go the extra mile to get the right information. Is a publicly-released poll touting that 9 out of 10 sportsmen vote more valuable than one kept internal showing that only 7 in 10 will likely vote? If all you’re after is a quick headline for the movement, a quick dose of patriotism, and maybe some numbers to casually throw in front of a politician, then it probably is better to forgo the expense of adding the extra questions that would really determine your true likely voters. However, if you want the poll to drive turnout machines, move resources in the right direction, or help formulate a plan to engage more people, it’s better to have the most accurate information. Personally, I’m more interested in results, so I’ll go with the latter option. It still shows that sportsmen vote at higher rates than the average voter, so it does us no harm. It also may show us how we can improve our outreach so the 9 in 10 statistic is actually reflected on Election Day.

5 thoughts on “Asking the Right Questions”

  1. I remember going over this kind of thing in my statistics and sociology classes in college. It’s the reason that I am suspicious of a great many polls. It is way too easy to intentionally get the results you want by wording questions and/or selecting participants in certain ways.

  2. The reason they want to identify likely voters is that everyone in campaigns is always looking for the Holy Grail: a turnout predictor. Political professionals with a hint of real knowledge all know you can’t predict turnout, period. It’s the one variable that there’s no formula for. Perhaps one will be discovered someday, but so far it’s out of everyone’s control and that drives campaign managers and candidates nuts.

  3. Tim, I don’t think they were intentionally going for a crazy high number or anything. Like the quote from RealClearPolitics says, other pollsters set the bar that low as well. It’s just a matter of whether the bar should be set that low.

    Peter, I actually think among sportsmen, we’d still get a pretty high number of likely voters even if they figured out the perfect predictor. I get that impression from the interactions I have with folks in the community – gun shows, clubs, etc. There seems to be far more general political and civic awareness there than among other groups of average folks I know who aren’t sportsmen, such as high school friends and the like. That said, I still think the 88%-98% this poll would indicate is a tad on the high side – though in line with what the numbers seem to be for Suffolk and their low-barrier questions.

  4. It might be more accurate to say “98% of the people who answered these questions….” I suspect the people who are willing to respond to a poll will also be willing to vote. And, conversely, those who will not vote will probably tell a pollster to FOAD.

    1. Actually, no. You’d be amazed at how many people are willing to lie to your face about even being registered to vote. I’ve caught people at it when I work at gun shows. I try to keep it friendly and just say something along the lines of, “Look, why don’t we get you registered and then you don’t have to do a thing until Election Day.” Of course, they are still unlikely to show up, but I just consider that the next hurdle to jump once they are registered.
