Monday, September 29, 2008

Is There a “Good” Poll?

Looking at Real Clear Politics this morning, I noticed that there is still a range of opinion about the presidential race. Obama supporters will be glad to hear that Rasmussen Tracking, Hotline/FD Tracking, and Gallup Tracking all say Obama is leading McCain by 5 points or more. McCain supporters can be consoled by the fact that Battleground Tracking says McCain is still leading by 2 points. Neither side would be happy to hear that trusting the headlines from a “tracking” poll would be about as good an idea as eating a sandwich made in Galveston 17 days ago. The short version on tracking polls is that they are trendy ways for polling groups to get a headline and maybe some attention, but they are simply unproven as empirical analysis. Tracking polls are volatile, depend on rough methodology that tends to oversample some demographics in the interest of speed, and have never been peer-reviewed the way more traditional polling is. These guys even get cute and mash these polls together to imitate a real effort to get a sense of national opinion. And, as I have warned many times before, when a poll does not release its internal data to the public, and conceals any significant part of its methodology and weighting, you should treat the pollster like a Bear Stearns banker.

Having said this for so long, I hear a lot of people say that they will ignore all polls, while others want to know which poll I think is “best”. That, however, is not so easy to answer.

First off, almost all polls have some validity. With very few exceptions, I do believe that polling groups with more than four years of experience in the business will try to follow a standardized methodology, and I also believe that most polls make a determined effort to avoid bias as much as possible. When I say that a poll has bias, therefore, I do not mean its findings should be thrown out; I mean the poll can still be accepted for its findings, provided the reader is aware of the poll’s history and tendencies. If the poll is consistent, you can compensate for that bias and get a generally objective idea of the situation. What’s more, if a poll is consistent in its methodology and weightings, then movement by a candidate over time within the same poll can be taken as valid evidence of growing or diminishing strength of support.
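To make that idea of compensating for a consistent lean concrete, here is a minimal sketch in Python. The poll names, lean estimates, and margins are invented for illustration only; a real adjustment would be built from each pollster’s documented record against actual election results.

```python
# Hypothetical sketch of compensating for a consistent "house effect" (bias).
# Poll names, lean estimates, and reported margins below are invented for
# illustration; a real adjustment would come from each poll's track record.

# Estimated historical lean, in points (positive = tends to run toward Obama,
# negative = tends to run toward McCain)
house_effect = {
    "Poll A": +2.0,
    "Poll B": -1.5,
}

# Today's reported Obama-minus-McCain margin from each poll, in points
reported_margin = {
    "Poll A": +5.0,
    "Poll B": -2.0,
}

for poll, margin in reported_margin.items():
    adjusted = margin - house_effect[poll]
    print(f"{poll}: reported {margin:+.1f}, adjusted {adjusted:+.1f}")
```

Note that after the hypothetical adjustment the two readings sit about 3.5 points apart instead of 7, which is the whole point of knowing a poll’s tendencies before reading its headline number.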

So, you don’t care about all the details, you just want to know who’s winning? Good luck with that; the national polls may not be much real help. That’s due to a number of things. First, you should know that polls taken before the last week of the campaign have no statistical value in predicting the election winner. Also, as I said, there is always some bias present, so the fact that one poll or many says something does not make it necessarily so (just one reason that both Obama and McCain have hired private firms to poll for them). And then there is a crucial fact to consider about the election: it is not one race, actually, but fifty-one races that decide the matter, the fifty states plus the District of Columbia. As Mister Gore realized a bit late in the 2000 election, it is the Electoral vote which determines the presidency.
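Since the decision comes down to winning contests rather than piling up a national margin, a short sketch may help show what actually gets counted. The state calls below are placeholders, not predictions; the electoral-vote allocations are the actual 2008 figures for those four states.

```python
# Sketch of the tally that actually decides the presidency: 538 electoral
# votes across the fifty states plus the District of Columbia, 270 to win.
# The electoral-vote counts are the 2008 allocations for these four states;
# the projected winners are placeholders, not a forecast.

electoral_votes = {"Ohio": 20, "Florida": 27, "Pennsylvania": 21, "Virginia": 13}
projected_winner = {
    "Ohio": "McCain",
    "Florida": "McCain",
    "Pennsylvania": "Obama",
    "Virginia": "Obama",
}

totals = {}
for state, ev in electoral_votes.items():
    totals[projected_winner[state]] = totals.get(projected_winner[state], 0) + ev

print(totals)                      # e.g. {'McCain': 47, 'Obama': 34} from these four alone
print("Needed to win: 270 of 538")
```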

So, as several people have asked, how do the state polls look? Well, that’s a tougher question. First off, you know how I am about transparency in poll internal data, but very few polls make that information public (and some, like Rasmussen, will provide it but only if you pay a fee, which goes against my principles: if you want the publicity of announcing your poll results to the media, you have the moral duty to provide all the supporting data). A significant exception is Survey USA, which is the best of the state polls in following clear NCPP rules. Also, I should warn the reader that in 2004 a lot of state polls were well off the mark. During the last month of the 2004 election, for example, I remember some polls which gave President Bush leads in Michigan and Oregon, and Kerry leads in West Virginia and Florida, none of which turned out to be correct in the actual election. State polls, it should be understood, run on smaller budgets than the national polls, and often draw from a much smaller sample, which creates a much greater margin of error.
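To put a rough number on that last point, here is a quick back-of-the-envelope calculation of how sample size alone drives the margin of error, using the standard 95%-confidence formula for a simple random sample at the worst case of an even split. Real polls also carry design effects from weighting, so treat these as floor values; the sample sizes are simply typical figures, not any particular poll’s.

```python
# Rough illustration: the margin of error shrinks only with the square root
# of the sample size, so a small state sample costs real precision.
# MOE = z * sqrt(p * (1 - p) / n), with z = 1.96 (95% confidence) and the
# worst-case p = 0.5. Weighting and other design effects would widen these.
from math import sqrt

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error, in percentage points."""
    return 100 * z * sqrt(p * (1 - p) / n)

for n in (1500, 1000, 600, 400):   # typical national vs. smaller state samples
    print(f"n = {n:4d}: about ±{margin_of_error(n):.1f} points")

# Output:
# n = 1500: about ±2.5 points
# n = 1000: about ±3.1 points
# n =  600: about ±4.0 points
# n =  400: about ±4.9 points
```

In other words, a 600-person state sample carries roughly a point and a half more slack than a 1,500-person national sample before any weighting problems are even counted.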
