
I am a pollster. But if you’ve lost faith in polling in the Trump era, I can’t really blame you.
Between underestimating Trump’s support in 2016, 2020, and 2024, the cheap Republican-leaning polls that presented a false narrative of what would happen in 2022, the maze of problems present in issue polling, and the continuing decline in response rates, the practice of measuring public opinion has been dealt one blow after another.
It hasn’t helped that pollsters have faced scandals of their own and are often some of the worst culprits when it comes to misusing or misinterpreting data for their own ends.
But while the polling profession may not have covered itself in glory, I’m worried that both researchers and readers have begun to overreact, losing sight of what polls are really good for — especially because it is easy engagement fodder to trash them on social media whenever they miss the mark.
So I’d like to mount a defense. An argument, if you will.
The true value of a survey doesn’t actually lie in the “what,” but rather the “how.” It doesn’t take a genius to see that the nation is closely divided, but massive, cross-cutting shifts are still happening under the hood, and surveys can reveal them to us, even if the top line suggests minimal movement.
For example, in 2024, polling told us that men across America became a lot more Republican, while women continued to hold fairly steady in their support for Democrats. Polling told us that seniors became more Democratic, even while young voters (and, in particular, young men) veered to the right in the Trump era.
What’s more, in many cases polling is the only tool we have. Election results in heavily Black areas may provide some insight into Black voting patterns, but there’s no equivalent for gender (good luck finding a series of cities that are entirely male or entirely female). Meanwhile, when it comes to age, university campuses are wildly unrepresentative of young voters at large, considering that the majority of people under 30 didn’t even graduate from college.
Finally, and perhaps most importantly, knowing how people are voting is only one part of the story. The real value of surveys arguably comes from learning about how different groups feel about issues in general, which helps us understand society at large.
What worries Americans today? What are their feelings on the radical transformations society has undergone of late? How do they view an administration intent on exerting maximal levels of presidential power, with little regard for precedent or permission? Who do they even trust any longer?
It’s impossible to ask all 330 million Americans these questions, so you need a sample that is representative of America as a whole. Surveys aren’t perfect, but they’re the only tools that make an effort to do this.
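One counterintuitive fact makes this workable: the precision of a well-drawn random sample depends on the sample’s size, not the population’s. Here’s a back-of-the-envelope sketch (in Python, purely illustrative) of why a sample in the low thousands can estimate national opinion to within a few points:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error for a proportion from a simple random sample.

    Uses the worst case p = 0.5; real surveys also carry design effects
    from weighting, so treat this as a floor, not a guarantee.
    """
    return z * math.sqrt(p * (1 - p) / n)

for n in (500, 1_000, 1_500, 5_000):
    print(f"n = {n:>5}: ±{margin_of_error(n):.1%}")
# n =   500: ±4.4%
# n =  1000: ±3.1%
# n =  1500: ±2.5%
# n =  5000: ±1.4%
```

The hard part, of course, isn’t the math; it’s drawing a sample that actually behaves like a random one, which is where response rates and weighting come in.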
It’s dangerous to eschew them in favor of your own social experience; our friend groups are mostly unrepresentative, and anecdotes from chance encounters are just as unreliable. Moreover, with the rise of polarization in the age of social media, our social circles have grown less and less ideologically diverse, locking us out of critical insights into how other people really feel.1
That’s why surveys are still used so widely. Even in an age when mistrust abounds, surveys dominate the data we consume and tell us how society is functioning.
Whether you’re looking at exit polls, postelection estimates, consumer sentiment measures, or the jobs report that the Trump administration just declared war on, many of the tools we use to make inferences about society at large are actually underpinned by surveys of some kind. For this reason, conducting them carefully and rigorously is important.
Lots of survey work is unfortunately quite sloppy (or, worse, purposely tortured to reach a result). Issue polling has been twisted and misused by interest groups on both sides of the aisle, resulting in surveys that conflict with basic reality. Abortion bans keep getting voted down in red states like Ohio and Kansas, despite polls suggesting that 15-week limits are politically viable. Ballot referendums suggest that a mass constituency for gun control doesn’t exist at the moment, despite what polls might say.
As powerful as surveys are, they can also be used to massage data and mislead people. Creative question framing and opaque weighting schemes can further skew studies that are already subject to a lot of variance.
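To make the weighting point concrete, here’s a minimal sketch of post-stratification, the basic adjustment behind most survey weighting. All numbers are invented for illustration; none come from a real poll:

```python
# A toy example of post-stratification weighting. Numbers are invented;
# real weighting uses many more variables (and usually raking rather
# than a single cell adjustment).

# Hypothetical sample: college grads are overrepresented, as they
# often are in online panels.
sample = {
    # group: (share of respondents, share approving of some policy)
    "college":    (0.60, 0.70),
    "no_college": (0.40, 0.40),
}

# Known population targets (e.g., from the Census).
targets = {"college": 0.38, "no_college": 0.62}

# Unweighted estimate: just the raw mix of respondents.
raw = sum(share * approve for share, approve in sample.values())

# Weighted estimate: re-mix the groups to population proportions.
weighted = sum(targets[g] * approve for g, (_, approve) in sample.items())

print(f"unweighted approval: {raw:.0%}")      # 58%
print(f"weighted approval:   {weighted:.0%}")  # 51%
```

Re-mixing the same raw responses to match population targets moves the estimate by seven points in this toy case, which is exactly why weighting choices matter and why opaque ones deserve scrutiny.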
That’s where we come in. At The Argument, we’re creating a data project to illustrate how Americans think. Unlike many pollsters, we aren’t just interested in telling you how people will vote; we want to show you how they view the world, which also tells you why they vote the way they do. We want to shine a light on how Americans feel about the most pressing issues in our society, without shying away from the good, the bad, or the ugly.
To this end, we’re going to begin a monthly survey project where we poll voters across the nation. Every month, we’ll ask them a set of detailed questions on one specific issue, rather than on a potpourri of different topics, and aim to go deep into how they feel about topics like free speech, AI, and immigration. To properly calibrate it to American politics, we’ll also include a few tracking questions every month, asking voters how they feel about things like the Trump presidency, the Democratic Party, and their voting intentions for November 2026. (For those interested, more technical details are in the footnotes.)2
To insulate ourselves from our own biases as much as possible, we’ve committed to having our questions and survey designs reviewed by a cross-ideological and independent group of reviewers. We will also be transparent with you about the sampling and methodological choices we make in designing and weighting our surveys, and we will frequently discuss their effects. We hope that doing this will build trust in the surveys we conduct, as well as in the way we write about them.
Polling is obviously flawed, but where would we be without it? Judge the value of surveys against the alternative, rather than the Almighty.
Determining what people truly believe is a core demand of democratic systems. Are elected representatives meant to learn about their constituents’ views solely through their conversations with the most-engaged voters? Should policymakers gauge public sentiment by looking at online posts artificially influenced and amplified by social media mobs? Are politicians supposed to assume that college-educated members of activist groups accurately represent the views of entire ethnicities?
Done well, polling is what guards us against motivated reasoning. It’s the type of thing that can insulate us from a tiny but loud minority with a megaphone dictating the way the world moves. It tells us what people believe, how they plan to vote, and how they view the world. It’s not the truth, but it can get us much closer to it than anything else can.
Over time, the unparalleled richness of our data will allow us to track all kinds of fascinating things regarding voter behavior. Here’s just a small sample of what we have planned for you:
How do voters who get their news from TikTok differ from those who get their news from The New York Times?
Do conservatives have an easier or harder time making friends than liberals do?
Who is most likely to cut off friends and family for differing political viewpoints?
How do voters feel about the rapid rise of AI, and what do they want their government to do about it?
How do young voters feel about America’s foreign policy, and how does it impact their voting behavior?
Who do Americans blame for the housing and affordability crises, and what do they want done about them?
One particular thing we hope to show you is that in a world of great uncertainty and constant change, many voters are also a maze of contradictions. People are quite malleable and fluid on many different topics, and they can have deeply held values while simultaneously changing the way they vote and think in response to current events. (For instance, the driver behind Trump’s 2016 success was winning over a large group of voters who were pro-universal health care and anti-immigration.)
We will have results that clash with the way we view the world, as well as those that provide confirmation for it. At times, the numbers will also provide extremely counterintuitive and conflicting signals that we must make sense of, because people are complex and measuring the way they think is hard.
Such a project is taxing, both in rigor and in methodology. But we’re up for the challenge, and we’re eager to get moving. Our founding members will have the opportunity to suggest questions for our upcoming polls; to join at this level, you can go here.
1. And for those of you who bring up social media as an alternative, I’m not sure that constant exposure to takes from people three standard deviations more insane than society at large is helpful in understanding much.
2. Our surveys are short: each is designed to take respondents six minutes or less, which should help us with low-engagement voters, who tend to be hard to reach and hard to retain through long questionnaires. To avoid the “bogus data” problem that plagues many online surveys, we verify each respondent against the voter file, check that all phone numbers are U.S.-based, and run a series of attention and quality checks so that insincere respondents aren’t counted. Our national surveys will run monthly, with a target sample size of 1,500 respondents each. We’ll partner with Verasight for fielding and quality checks, but the survey design and weighting will be done entirely by us. (This is similar to how The New York Times/Siena College partnership works.)
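For those curious what those quality checks involve, here’s a hedged sketch in Python. The field names, thresholds, and logic are hypothetical; this is not Verasight’s pipeline or ours, just the general shape of respondent screening:

```python
# Hypothetical respondent-screening pass. Field names, thresholds, and
# checks are illustrative only, not any vendor's actual pipeline.
from dataclasses import dataclass

@dataclass
class Respondent:
    voter_file_match: bool   # matched to a voter-file record
    us_phone: bool           # phone number resolves to a U.S. carrier
    attention_passed: int    # attention-check items answered correctly
    attention_total: int
    completion_seconds: int  # time taken to finish the questionnaire

def is_valid(r: Respondent, min_seconds: int = 120) -> bool:
    """Keep a respondent only if they clear every quality gate."""
    return (
        r.voter_file_match
        and r.us_phone
        and r.attention_passed == r.attention_total  # no failed checks
        and r.completion_seconds >= min_seconds      # screens "speeders"
    )

responses = [
    Respondent(True, True, 3, 3, 340),   # kept
    Respondent(True, True, 2, 3, 310),   # dropped: failed a check
    Respondent(True, False, 3, 3, 360),  # dropped: non-U.S. number
    Respondent(True, True, 3, 3, 45),    # dropped: sped through
]
kept = [r for r in responses if is_valid(r)]
print(f"kept {len(kept)} of {len(responses)} responses")  # kept 1 of 4
```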