As measured in a series of polls, support for Barack Obama among likely voters, both nationally and in swing states, has increased over the past couple of weeks. (Though two new polls today may offer evidence that the trend is now diminishing nationally.)
As these polls have trickled out, a talking point has emerged in conservative media: that the polls are “skewed” in favor of Obama voters. The criticism mostly turns on the notion that pollsters are using an “incorrect voter turnout model,” one based on the huge surge of participants in 2008 rather than on the number of people voting in a more routine presidential election. A Virginia Republican has even launched a website called UnSkewed Polls, dedicated to “correcting” the turnout assumptions in each survey.
Here’s just one example of the incorrect-turnout-model critique, from Fox News contributor Douglas Schoen in an article called “The truth about 2012 polls”:
The assumption the pollsters are making, that turnout in 2012 will be the same as or even better for Obama than in 2008, is flawed. Not only are we looking at a terrible economic situation, but there will be key differences in turnout from 2008 that will affect the results and the accuracy of these polls.
Democratic registration may be overstated. It is my belief that many weak Obama voters are saying that they are Democrats when they really aren’t partisans at all: they are disillusioned with American politics. What this means is that these people aren’t even certain to vote in November and if they stay home, Obama’s numbers will surely be affected.
Additionally, fewer young people will turn out at the polls this year. As Obama’s push to mobilize the youth vote, a group that he won handily in 2008, demonstrates, this is a key group that is becoming increasingly apathetic and is apt to turn out in smaller numbers. Full article
So… is there any validity to this criticism?
We turned to USF lecturer David Latterman for guidance. Latterman has done lots of professional polling and consulting in San Francisco and Northern California, and has also taught campaign management. Latterman told me that while polling is an art, it’s also “not hard to get it right.” Here’s my interview with him, edited for length and clarity. Keep in mind his political work has been for Democratic candidates.
JON BROOKS: What do you think of the “inaccurate turnout projection” criticism of the polls by Republicans?
POLLSTER AND POLITICAL CONSULTANT DAVID LATTERMAN: That is always an important critique. If you’re going to poll, you need to approximate what the turnout is going to be for any given election, because each election is different and polls are only as good as their representation of likely voters.
So you have to ask: Are the demographics right? Are you using a proper cell phone sample? Those are legit questions and anyone is correct to raise the issue in a general sense.
However, the problem with that critique on these Obama-Romney polls is that basically every poll out there over the last couple of weeks says roughly the same thing. So you’re starting to see convergence on an awful lot of polls. And that’s what you’re looking for. If you’ve got several reputable polling operations coming to the same conclusion, giving you pretty much the same numbers within any reasonable margin of error, they’re not getting their turnout models wrong collectively.
“You’re starting to see convergence on an awful lot of polls. And that’s what you’re looking for.”
There are different ways you can gauge likelihood to vote. You can use voters’ past history, or you can ask a series of filter questions; every operation has its own way of doing it.
Now it has happened before that a mass of polls were wrong. New Hampshire 2008 was a collective screw-up. But it doesn’t happen much. And for what it’s worth, Fox News is coming up with the same numbers as everybody else is.
So while consistency doesn’t mean everyone is doing it right, it’s something to look at.
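For readers curious about the “reasonable margin of error” Latterman mentions: for a simple random sample, it comes from a standard formula. Here’s a minimal sketch; the poll size and support figure below are illustrative, not from any survey in this article, and real pollsters widen the published margin to account for weighting and design effects.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for an observed proportion p in a
    simple random sample of n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# A hypothetical national poll of 1,000 likely voters showing 50% support:
moe = margin_of_error(0.50, 1000)
print(f"+/- {moe * 100:.1f} points")  # roughly +/- 3 points
```

This is why two reputable polls showing, say, 49% and 51% for the same candidate are “pretty much the same numbers”: both fall within each other’s margin of error.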
JON BROOKS: So if I understand you correctly, you’re saying professional pollsters would never assume that turnout would be the same as in 2008, because it was extraordinary that year.
DAVID LATTERMAN: Oh, good Lord, no. I’m working races right now in San Francisco and nobody thinks that. There’s no way. You have to figure that out early on; you have to know your universe.
In California we’ve studied the “surge voter” – the voter who turned up for the first time in his or her life to vote for Obama. A lot of minority voters, younger voters – we didn’t see them vote for governor two years later and we don’t expect to see them again. So we not only have to think of 2008 vs. 2012, we have to think about specifically which people aren’t going to show up again.
You also have to think about several states where likely Obama voters are not going to be able to vote because of the new voter-ID laws. It’s not a huge number of people who will be affected, but it’s enough to matter. Whether a NY Times pollster is doing this I don’t know, but I guarantee you Obama’s people are doing it.
JON BROOKS: Are there any polls you trust more than others?
DAVID LATTERMAN: The Quinnipiac stuff is usually pretty valid. And with the big outfits, if they’re wrong they’re just statistically wrong, and they’re not wrong by much. They have a lot at stake to not be wrong. If they’re wrong, everybody sees it.
Latterman and I spoke on Friday. Over the weekend, the Big Daddy of all poll analysts, Nate Silver of the New York Times’ 538 blog, addressed the issue of alleged poll bias. His analysis dealt with the charge that polls were “oversampling” Democratic voters to yield results more favorable for Obama. Here’s the crux of what he said:
The criticisms [of bias] are largely unsound, especially when couched in terms like “oversampling,” which implies that pollsters are deliberately rigging their samples.
Pollsters will re-weight their numbers if the demographics of their sample diverge from Census Bureau data. For instance, it is typically more challenging to get younger voters on the phone, so most pollsters weight their samples by age to remedy this problem.
But party identification is not a hard-and-fast demographic characteristic like race, age or gender. Instead, it can change in reaction to news and political events from the party conventions to the Sept. 11 attacks. Since changes in public opinion are precisely what polls are trying to measure, it would defeat the purpose of conducting a survey if pollsters insisted that they knew what it was ahead of time.
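The re-weighting Silver describes is often called post-stratification. A minimal sketch of the idea follows; all the age-group shares and support figures are made-up illustrative numbers, not data from any actual poll or census table.

```python
# Post-stratification sketch: re-weight a sample whose age mix
# differs from census targets. All numbers are hypothetical.

# Share of each age group among survey respondents (young voters
# are harder to reach by phone, so they're under-represented here).
sample_share = {"18-29": 0.10, "30-49": 0.30, "50-64": 0.35, "65+": 0.25}

# Target shares for the same groups, as census data might give them.
census_share = {"18-29": 0.20, "30-49": 0.35, "50-64": 0.27, "65+": 0.18}

# Candidate support observed within each age group of the sample.
support = {"18-29": 0.60, "30-49": 0.52, "50-64": 0.48, "65+": 0.45}

# Each group's weight is its target share divided by its sample share.
weights = {g: census_share[g] / sample_share[g] for g in sample_share}

# Unweighted estimate uses the sample mix; weighted uses the target mix.
raw = sum(sample_share[g] * support[g] for g in support)
weighted = sum(census_share[g] * support[g] for g in support)

print(f"unweighted support: {raw:.1%}")
print(f"weighted support:   {weighted:.1%}")
```

The point of Silver’s distinction is that this correction works for stable traits like age, race, or gender, where census targets exist. Party identification has no such fixed target, because it moves with the news, so forcing it to a preset level would erase exactly the opinion shifts a poll is meant to detect.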
Silver went on to examine whether the polls have a “history of being biased toward one party or the other.” He writes:
The polls have no such history of partisan bias, at least not on a consistent basis. There have been years, like 1980 and 1994, when the polls did underestimate the standing of Republicans. But there have been others, like 2000 and 2006, when they underestimated the standing of Democrats. Full post
By the way, here’s that other go-to political analyst, Stephen Colbert, on the polling bias issue….