So glad you’re doing this. Has the outside review board been named?
We'll announce who's reviewed each poll at release, so as to properly credit people for stuff they did and not attribute things to them that they didn't have involvement in :)
As someone who works primarily in survey research, I always like to draw a distinction between survey research (utilizing quantitative methods to learn more about the views and experiences of populations) and polling (figuring out how people will vote on a given issue or politician). I'm glad this sounds more like survey research, because for polling I'm always forced to ask... Why does anyone care?
If you're not a political consultant, polling is sometimes wrong but always useless. Individuals shouldn't (yes this is a moral judgment) care deeply about the popularity of most issues. Focus on right/wrong, effective/ineffective, closer to/further from your desired state. Most discussions about polling are pure political hobbyism. As a devoted political hobbyist, I am confident about that. You don't need to message test your Twitter.
I'm very hopeful for a project that sounds more attuned to understanding the world than to giving us another box score on our favorite TV characters.
Can also recommend Lakshya’s recent chat with Nate Silver: https://www.natesilver.net/p/real-talk-on-models-moderation-and
Lovely insights from Lakshya regarding the necessity of polling.
Question. It felt to me that there was a decade-long consensus that Obama won on a new coalition that de-emphasized the WWC. I feel like it wasn’t until Trump won that people started saying that the new coalition was largely a myth and that he’d done much better with the WWC than previously understood.
Everyone says there’s a significant lag (generally years) between understanding how people voted in the prior election, and that exit polls are bad, but exit polls end up driving the narrative. How much is this happening now?
Also, looking at the link to Nate’s blog here, it appears that the best pollsters this cycle were all hardcore rightwing partisan outfits. Were they actually better, or did their systematic error just happen to fall in the right direction? IOW, should we be looking at them for 2026 and beyond? How did they do in previous midterms?
I do think this is much better now that election data analysis is more sophisticated. The types of inferences being drawn today from data are light-years ahead of what 2012 brought, when there was literally Nate Silver and nobody else.
With respect to the pollsters question...it's mostly the latter, in that their systematic bias tended to serve them well this cycle. These methodologies crashed hard in 2022, and I haven't seen proof they translate consistently across years.
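The "lucky bias" point above can be made concrete with a toy simulation. This is a hypothetical sketch with made-up numbers, not real polling data: it just shows how a pollster with a fixed partisan "house lean" can top accuracy rankings in a cycle where the whole field misses in the opposite direction, then crash in a cycle with no field-wide shift.

```python
import random

random.seed(0)

def poll_miss(house_lean, cycle_shift, noise_sd=2.0):
    """Absolute miss of one poll, in points: the pollster's fixed house lean,
    plus a field-wide shift common to the whole cycle, plus sampling noise."""
    return abs(house_lean + cycle_shift + random.gauss(0, noise_sd))

def avg_miss(house_lean, cycle_shift, n_polls=200):
    """Average absolute miss over a cycle's worth of polls."""
    return sum(poll_miss(house_lean, cycle_shift) for _ in range(n_polls)) / n_polls

# Cycle A: the whole field underestimates one party by 3 points
# (cycle_shift = -3). A pollster whose house lean is +3 cancels it out
# and looks like the most accurate shop in the business.
neutral_a = avg_miss(house_lean=0, cycle_shift=-3)
leaning_a = avg_miss(house_lean=3, cycle_shift=-3)

# Cycle B: no field-wide shift. The same +3 lean is now pure error.
neutral_b = avg_miss(house_lean=0, cycle_shift=0)
leaning_b = avg_miss(house_lean=3, cycle_shift=0)

print(f"Cycle A avg miss - neutral: {neutral_a:.1f}, leaning: {leaning_a:.1f}")
print(f"Cycle B avg miss - neutral: {neutral_b:.1f}, leaning: {leaning_b:.1f}")
```

Ranking pollsters on one cycle rewards whoever's bias happened to match that cycle's error, which is why a single good year (like the 2022 crash mentioned above going the other way) is weak evidence that a methodology will keep working.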
Curious what's the running cost of fielding a monthly survey/poll of this size?
In general, this tends to vary by vendor. We're contractually not allowed to share the details of the agreements, but you can think of $10,000 per survey as a reasonable estimate for something of this level, if you want to be good with sampling and do something online instead of live-caller/live-text (those two are way more expensive).
Polling is fine, but do people tell the truth? The method is the key. But really, is polling important to voters or campaigns?
Yes. Campaigns don't *just* rely on surveys, but they use them a lot to inform things like messaging and spending. It's really important because it shapes the tenor of your campaign strategy. For instance, if you are noticing weakness among Black voters, you want to catch that early and take concrete steps to address it.
What does it mean to average voters who do not watch cable entertainment/news?