Monday, November 15, 2010

SurveyUSA polls the Governor's race again

SurveyUSA (11/10, 10/28 in parentheses):
Mark Dayton (D): 45 (39)
Tom Emmer (R): 44 (38)
Tom Horner (I): 9 (13)
Undecided: 1 (6)
(MoE: ±3%)

KSTP decided that it would be fun to resurvey the Minnesota Governor's race, you know, now that we know the outcome and everything.

What they found confirms what happened, and although we have actual ballots to tell us Mark Dayton won, the ballots can't tell us about some of the underlying numbers that this poll sheds a little light on.

Umm, that's not right

David Brauer of MinnPost already pointed out the most glaring aspect of the poll:

The 1,400-person poll's very first question: "Did you vote in Tuesday's election for Minnesota's governor?" 81 percent said yes.

The actual figure: 58 percent.

That's a 23-point miss, no small amount of error, leading Brauer to believe that more than a few Minnesotans were less than honest about their civic-mindedness.

But given SUSA's general accuracy this cycle, a simpler scenario is that people fibbed. Researchers call it a "socially desirable response" — you're likelier to tell a stranger (or, in this robo-poll, a stranger's recorded voice) that you did your civic duty.

I'm not so sure this is what is going on.

For one thing, Nate Silver has found evidence that the "socially desirable response" effect is diminished when the poll is conducted by an automated pollster.

Automated polls have sometimes shown relatively lower levels of support for gay marriage initiatives, for instance, in states like Maine and California. Homophobia is fairly common, but has become socially undesirable; the purveyors of the automated polls have sometimes claimed that their respondents are free to be more honest when there's not another human being on the line.

If this is true, then voters shouldn't have as much trouble admitting to an automated script that they didn't vote. That means another factor is at work here, and the most likely culprit is non-response bias.

In short, the people who actually pick up the phone for the pollster's call, stay on the line once they realize it's a pollster calling, and go on to complete the survey are more likely to be people who also voted.

If you're not willing to go vote, you're also probably not willing to talk to a pollster about how you didn't go vote. On the other hand, if you did go vote, you're more likely to want to talk to a pollster about voting.

This kind of non-response bias is actually helpful to pollsters when conducting polls before an election. That's because who a pollster considers a likely voter is a big part of how accurate a poll will be, and this non-response bias helps sift out those who are unlikely to vote.

But when doing this type of post-election survey, non-response bias can lead to weird results like this one. It's hard to say exactly what is driving the 23-point gap between the poll and reality, but I don't think it's because 23% of Minnesotans are lying liars, although I don't doubt that some are.

That said, for the sake of this analysis I'm going to assume that the underlying numbers are valid, but feel free to take them with a grain of salt, if not a shaker.

Assuming Minnesotans are not lying liars

Tom Emmer lost the election in June

The last question that SUSA asked in the survey was this one:

What do you think is the best way to balance the state budget? Raise taxes? Cut spending? Or both?

Raise Taxes: 6
Cut Spending: 59
Both: 32
Not Sure: 3
(MoE: ±2.6%)
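As a sanity check, the stated ±2.6% is consistent with the full 1,400-person sample under the standard worst-case margin-of-error formula (an assumption on my part; SurveyUSA doesn't publish its exact calculation):

```python
import math

n = 1400   # full sample size
p = 0.5    # worst-case proportion, which maximizes the margin of error
z = 1.96   # z-score for 95% confidence

moe = z * math.sqrt(p * (1 - p) / n)
print(f"±{moe:.1%}")  # ±2.6%
```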

If we add up all the responses except "Cut spending," we get 41%, which is less than the share of people who said they voted for Mark Dayton. Put another way, 59% of respondents said they supported Tom Emmer's position on how to balance the budget, but only 44% said they voted for him.

Apparently 15% of respondents supported Tom Emmer's position on the budget, his most clear and consistent message, and yet didn't vote for him.

It's apparent that Tom Emmer's image never recovered from the hit it took early on with the tip credit fiasco and other such gaffes. Even though he consistently pushed the "cut spending" message, people weren't able to get past their initial negative feelings about him.

The enthusiasm gap

The partisan breakdown of who voted sheds a little light on why Democrats lost the Minnesota House and Senate.

Group: Those who voted/Didn't vote
All Voters: 81/19
Democrat: 77/22
Republican: 91/9
Indy: 80/20

As we already knew, Republicans turned out in greater numbers than Democrats and Independents, and that appears to have been the primary difference in a number of close races.

Who did Horner hurt?

Of course, the big news out of this poll is that Horner took more votes from Mark Dayton than from Tom Emmer. KSTP featured this aspect of the poll in its headline:

KSTP/SurveyUSA Poll: Horner Drew More Votes From Dayton

That headline is another thing that Brauer takes issue with and he should.

This particular question in the poll, which was asked only of those who said they voted for Horner, has an astronomical ±9.9% margin of error. That's because those who said they voted for Horner make up only 9% of the sample.

The poll shows those Horner voters breaking for Mark Dayton 37-29, but that is an eight-point margin, falling within the question's margin of error, meaning you should be careful about drawing any sweeping conclusions from it, like the above headline.
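The arithmetic behind that caution can be sketched with the standard worst-case margin-of-error formula. The exact subsample size is an assumption (SurveyUSA doesn't publish it); I'm estimating it as 9% of the roughly 81% of 1,400 respondents who said they voted:

```python
import math

def moe(n, p=0.5, z=1.96):
    """Worst-case 95% margin of error for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Estimated Horner subsample: 9% of the ~1,134 who said they voted
horner_n = round(0.09 * 0.81 * 1400)  # ~102 respondents
print(f"subsample MoE: ±{moe(horner_n):.1%}")  # ≈ ±9.7%, near the reported ±9.9%

# The margin on the *difference* between two answers from the same
# respondents is wider still (standard multinomial variance formula),
# so the 37-29 Dayton edge is well inside the noise.
p1, p2 = 0.37, 0.29
diff_moe = 1.96 * math.sqrt((p1 + p2 - (p1 - p2) ** 2) / horner_n)
print(f"MoE on the 8-point gap: ±{diff_moe:.1%}")
```

In other words, even taking the poll at face value, the gap between Dayton and Emmer among Horner voters is roughly half the uncertainty attached to it.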

Choice of Headlines

David Brauer is right to question KSTP's choice of headline; unfortunately, the article he uses to pose this question has headline problems of its own.

"Did Minnesotans lie to KSTP's pollster?" is no better than "Horner Drew More Votes From Dayton" because both are specious claims. And yes, I can see the question mark that Brauer put in his headline, but the content of the article purports to answer the question posed, and I don't believe that it does. Not with the correct answer, at least.


A poll like this is not really useful for the topline numbers it provides; we had an election for that. Rather, if the topline numbers roughly match the real numbers, as they do in this case, the underlying data can provide some useful information.

The crosstabs on the topline numbers, for instance, will offer insights that can't be gleaned from the ballots alone. Even the questions asked of Horner voters, with their ±9.9% MoE, can provide some useful info if put into the proper perspective.

The usefulness of this poll is that it helps to confirm some of the speculation as to what happened on election day. Republicans turned out in droves, and despite that Tom Emmer couldn't recover from his awful introduction to Minnesota voters even though his message resonated with people.
