It seems like selection effects explain everything here. X is unique among text-based social media platforms: Musk planted his banner and blew his horn to attract conservatives, while a large fraction of Trump opponents left Twitter for Mastodon and Bluesky.
The macro empirical evidence (different results on TikTok) and more sophisticated, holistic theoretical models suggest that "media mind control" is not very convincing, nor predictive. (Dan Williams' "Scapegoating the Algorithm" is a very good introduction: https://asteriskmag.com/issues/11/scapegoating-the-algorithm.) As you acknowledge, "people increasingly gravitate toward news sources that fit their worldview". Combine that with various tribal signaling games and other social dynamics, and there's nothing significant left for the Northwestern study (for example) to explain.
More thoughts follow. (Or maybe the same thoughts in a more considered package.)
My mental model is that the behavior we see on Twitter is mostly an artifact of selection and incentives rather than a driver of real-world political behavior. It's a classic example of range restriction: the platform draws a narrow slice of the population to begin with, and then further amplifies the loudest, most ideological, and most attention-seeking participants. Add the competitive dynamics of virality and the constant pressure for political signaling, and you get a stage for partisan peacocking that doubles as a status game.
That combination seems sufficient to explain most of the discrepancies between what appears dominant on Twitter and what shows up in surveys, elections, or everyday social life.
If that explanation is right, then “Twitter discourse” should mostly be noise: exaggerated, polarized, and unrepresentative. What would change my mind is credible evidence that the platform has meaningful causal effects on ordinary or marginal participants, not just the deeply committed political performers.
In particular, I’d want studies that look beyond the most active users and focus on people at the edges: outside tight ideological bubbles, casual users, or people newly exposed to the platform. Do their attitudes systematically shift after sustained exposure? Do they become more polarized, more extreme, or meaningfully different from comparable non-users? And can those shifts be distinguished from ordinary persuasion or the simple effect of encountering arguments or information they hadn’t previously seen?
Absent that kind of evidence, the simpler explanation still seems more plausible: X is mostly a distorted stage on which a small group performs, not a driver of political attitudes.
While the evidence for algorithms shifting politics at the macro level is somewhat thin, we can observe effects empirically at the micro level. There is also the question of how we measure and frame it: sure, you might have gone in left-wing or right-wing, but algorithmic content suggestion can push you further to the left or right, which is different from turning left-wingers into right-wingers and vice versa.
One example I would point to: if you are often critical of progressives, the algorithm will pick up on that and surface content that critiques progressives, leading to a feedback loop of "can you believe what these CRAZY leftists are doing now?" that narrows your view of the world. I think this is partly why a lot of tech people who got annoyed by woke fell down the rabbit hole of right-wing internet content.
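That feedback loop can be sketched as a toy simulation. This is a hypothetical winner-take-all model with made-up numbers, not any real platform's algorithm: the feed favors whatever you have engaged with most, and engaging with what you're shown reinforces it.

```python
# Toy model of an engagement feedback loop. All numbers are invented.
engagement = {"anti-progressive": 6, "other": 4}  # slight initial lean

for _ in range(200):
    shown = max(engagement, key=engagement.get)  # feed picks the front-runner
    engagement[shown] += 1                       # viewing it counts as engagement

share = engagement["anti-progressive"] / sum(engagement.values())
print(f"{share:.0%}")  # the initial 60/40 lean now dominates the feed
```

Real recommenders are probabilistic rather than strictly winner-take-all, but the compounding dynamic is the same in spirit: a small initial lean in what you engage with snowballs into a feed dominated by that one theme.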
What I've noticed in the past 10 years is a solidifying of talking points and of what the right opinion is to have, at least on the left (I'm sure this happens on the right as well): defund the police, transwomen are women, Gaza is a genocide, etc. It's not that these statements lack merit, but it is a little worrying that we so rapidly converge on positions you must affirm or be considered MAGA. I don't remember that happening to the same extent in the 90s or even the early 2000s.
This has been an accelerating dynamic since the literal invention of the printing press. Before then, if you wanted someone to know something that wasn't known in your village, you had to walk there and tell them about it, which took a long time! Information spread pretty slowly.
Then, after the printing press, you didn't need to write it out yourself: you could mass-print a bunch of pamphlets, put them on a cart, and take them to the next town over. This is what helped spread the Reformation throughout Europe, as people could actually engage in debate with one another because they finally had a mechanism for learning what other people thought about things! Before, you wouldn't even know others disagreed, because how would they tell you?
Then as printing technology improved the pace of cultural change and discourse accelerated. The next big jump was the telegraph. Now you could send messages across vast distances! We could find out there was fighting happening in the new midwestern states over slavery without waiting for letters or people to make their way east.
Radio after that was a titanic upheaval in communication. The European political crises of the early 20th century could not have taken the form they did without radio; it was vital to spreading the message of every new ideology springing up throughout Europe like mushrooms after a rainy night. Again the discourse accelerates. In Spain, every faction of the civil war got its news and talking points from radio broadcasts.
That is all to say, we think and communicate so much faster these days. So consensus being reached quicker is obviously downstream of that!
Okay yeah this explanation feels very plausible to me.
To borrow the lingo from causal ML, I think the ATE of Twitter is probably not that high. But if you look at the CATE, you can see some wild values here and there, and tech people are probably the epitome of it.
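For readers unfamiliar with the terms, here is a minimal sketch of the ATE/CATE distinction on entirely made-up numbers, purely to illustrate how a modest average effect can hide a large subgroup effect:

```python
# Each row: (group, used_twitter, attitude_shift). All values invented.
rows = [
    ("general", True, 0.1), ("general", False, 0.0),
    ("general", True, 0.0), ("general", False, 0.1),
    ("tech",    True, 2.0), ("tech",    False, 0.1),
]

def effect(sample):
    """Difference in mean outcome between treated and untreated rows."""
    treated = [y for _, t, y in sample if t]
    control = [y for _, t, y in sample if not t]
    return sum(treated) / len(treated) - sum(control) / len(control)

# ATE pools everyone; CATE conditions on the subgroup.
ate = effect(rows)
cate = {g: effect([r for r in rows if r[0] == g]) for g in ("general", "tech")}
print(ate, cate)
```

The pooled ATE is modest, the general-population CATE is roughly zero, and the tech-subgroup CATE is large: the "wild values here and there" only show up once you condition on the subgroup.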
Elon is probably the prime example of it. I think he was clearly annoyed by woke and was pretty transphobic, but I don't recall him being that overtly Nazi; now he's basically retweeting straight-up Nazi stuff while he shouts conspiracy theories about the persecution of South African whites…
To me the most disappointing thing about how a lot of tech people have conducted themselves of late is that they used to crow about how open-minded they are, and then, when challenged culturally, they reacted in a way that is definitionally the opposite of open-minded.
Yeah, no doubt. It's kinda weird how they basically leaned hard into confirmation bias and audience capture.
One thing I keep thinking about is that I don't see fresh faces among those tech giants these days: Elon, Zuck, Marc Andreessen, and Peter Thiel have been around for almost two decades at this point. Peter Thiel felt like some kind of right-wing weirdo from the get-go, and Andreessen was never progressive/liberal (though not this Too Online), but part of me feels that having been in power for so long, they themselves have become very defensive/conservative, although they will never acknowledge it.
For good or ill, at least on Twitter you see the conflict. On Reels and Instagram the filter bubble is so well done that even if you're not looking for politics you get a kind of ambient current. Twitter always drives me toward conflict: even posts about Chappell Roan and Taylor Swift are a trench war.
My TikTok is mostly about teaching and vegan food and pop music and theater, so of course it's insanely left-wing. So the idea that right-wing ideas do okay on Reels and TikTok is conceptually easy to grasp but hard to see in real time.
This makes a lot of sense, and it's no wonder the current admin acts like its feedback loop is completely broken.
I think a lot of people interpret this as "they dgaf about popularity, and this is a sign that they are authoritarian". While I agree the current admin is clearly wannabe authoritarian, I think that in their heads they are doing what they think is popular; their feedback loop is just completely broken.
And a "treatment effect" of the Twitter platform is very plausible; if anything, I feel like Elon himself (and maybe Marc Andreessen?) is the epitome of it.
When he took over X, I don't recall Elon being this outright a Nazi or white supremacist, although he was very transphobic. But as he essentially turned X into 4chan's /pol/ or the Daily Stormer, his brain was poisoned in that direction too…
Twitter used to be a broader spectrum of the population, but X is owned by an openly Nazi guy who made AI features that strip women's and children's clothes out of pictures and work anti-semitism into conversations. Saying "turns out the people still using X skew right" is like saying "turns out people on Truth Social are mostly MAGA". Yeah, no kidding.
“the algorithm’s pro-Republican bias is evident to anyone who spends even a couple of minutes on the platform.” — ok so I’m not an X power user by any stretch, but I am on it occasionally, and I can’t say I’ve noticed this. Since it seems so obvious to you, I’m curious, what does it look like for you?
Very cool data set! One thing I always want to know about these finely sliced subgroups is what the error bars look like. I wish we had data on Twitter pre-Elon, so we could tease out causation from algorithmic meddling.
Thanks! The error bars for the aggregate surveys are smaller than you might think. We have a lot of data here; even the smallest cohort (Reddit) still has over n=1,000.
For the January national survey, it's obviously a bit larger. But still informative and the numbers line up very well with the aggregate polls.
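For intuition on why n = 1,000 is enough, here is a back-of-envelope check (my arithmetic under a simple-random-sampling assumption, not the survey's actual variance calculation). The 95% margin of error for a sample proportion is at its worst at p = 0.5:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a sample proportion under simple random sampling."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) for a cohort of n = 1,000:
worst = margin_of_error(0.5, 1000)
print(f"+/- {100 * worst:.1f} percentage points")  # about +/- 3.1 points
```

So even the smallest cohort supports roughly a three-point margin of error, which is why the subgroup gaps discussed here are comfortably outside the noise.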
I use X and am pro-Trump since the Venezuela backlash but I mostly enjoy arguing with the Chinese trolls who say Epstein’s existence is why Russia is justified in invading Ukraine. I get my news “elsewhere.”
Anne Applebaum simultaneously criticizing the operation for being too invasive and not invasive enough did more to shift my support toward Trump than any pro-Republican bias could.

Seeing Bernie argue against it purely on "legality" is the last straw. I will never vote again. I identify as Republican. What have I become?
Periodically, you will see figures showing that the percentage of adults who use X daily is very low. I'm seeing about 25% who use it at all, and then less than half of those say they use it daily. And I think we have all seen that less than 10% of users are responsible for more than 90% of posts.
Does it matter that more than 50% of a small slice of US adults get their information from X? It still seems to have a weird pull on the people who are on it and on people in media; meaning, they love to talk about how terrible it is now.

This question actually asked people where they regularly get their news, rather than where they post; around 20% said X.
Valuable work; thank you. It should remind Argument readers that we are an analytical elite.
Surely we could compare republicans on reddit to republicans on twitter? Is the effect still there?
yes. let's go a bit more general, actually:
Trump 2024/on X: Approve +74
Trump 2024/no X: Approve +69
Harris 2024/on X: Disapprove +81
Harris 2024/no X: Disapprove +92
The fact that there's even a ~10 point effect of X _among Harris voters_ is wild.
Thank you. Do we have enough to control for age?