Valuable work; thank you. It should remind Argument readers that we are an analytical elite.
Surely we could compare Republicans on Reddit to Republicans on Twitter? Is the effect still there?
Yes. Let's go a bit more general, actually:
Trump 2024/on X: Approve +74
Trump 2024/no X: Approve +69
Harris 2024/on X: Disapprove +81
Harris 2024/no X: Disapprove +92
Thank you. Do we have enough to control for age?
The fact that there's even a ~10 point effect of X _among Harris voters_ (net disapproval of +81 on X vs. +92 off it) is wild.
It seems like selection effects explain everything here. X is unique among text-based social media platforms: Musk planted his banner and blew his horn to attract conservatives, while a large fraction of Trump opponents left Twitter for Mastodon and Bluesky.
The macro empirical evidence (different results on TikTok) and more sophisticated, holistic theoretical models suggest that "media mind control" is not very convincing, nor predictive. (Dan Williams' "Scapegoating the Algorithm" is a very good introduction.*) As you acknowledge, "people increasingly gravitate toward news sources that fit their worldview". Combine that with various tribal signaling games and other social dynamics, and there's nothing significant left for the Northwestern study (for example) to explain.
*https://asteriskmag.com/issues/11/scapegoating-the-algorithm
While the evidence for algorithms shifting the politics of mass groups (macro) is somewhat thin, we can observe effects empirically at the micro level. There is also the question of how we measure and frame it. Sure, you might have gone in left-wing or right-wing, but algorithmic content suggestion can push you further to the left or right, which is different from turning left-wingers into right-wingers and vice versa.
One example I would point to: if you are often critical of progressives, the algorithm will pick up on that and surface content that critiques progressives, leading to a feedback loop of "can you believe what these CRAZY leftists are doing now?" that narrows your view of the world. This, I think, is partly why a lot of tech people who got annoyed by woke politics fell down the rabbit hole of right-wing internet content.
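To make that mechanism concrete, here is a toy simulation of an engagement-weighted recommender (my own construction, not any platform's actual ranking code; the topic names and the 1.2x boost are invented): each click on one topic raises the odds of being shown more of it, and the feed narrows fast.

```python
import random

random.seed(42)

topics = ["anti-progressive", "sports", "science", "cooking", "music"]
weights = {t: 1.0 for t in topics}  # the feed starts out balanced

feed = []
for _ in range(200):
    # Show one item, chosen in proportion to current topic weights.
    topic = random.choices(topics, weights=[weights[t] for t in topics])[0]
    feed.append(topic)
    # The user reliably engages with one topic; engagement boosts its weight.
    if topic == "anti-progressive":
        weights[topic] *= 1.2

share = feed[-50:].count("anti-progressive") / 50
print(f"Share of the last 50 items: {share:.0%}")  # climbs toward 100%
```

A 20% starting share compounds into near-total dominance within a couple hundred items, with no change in the user's underlying views.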
What I've noticed in the past 10 years is a solidifying of talking points and of what the correct opinion to hold is, at least on the left. (I'm sure this happens on the right as well.) Defund the police, trans women are women, Gaza is a genocide, etc. It's not that there is no merit to these statements, but it is a little worrying how rapidly we arrive at positions you must affirm or be considered MAGA. I don't remember that happening to the same extent in the '90s or even the early 2000s.
Okay, yeah, this explanation feels very plausible to me.
To borrow the lingo from causal ML, I think the ATE (average treatment effect) of Twitter is probably not that high. However, if you look at the CATE (conditional average treatment effect), I think you can see some wild values here and there, and tech people are probably the epitome of it.
Elon is probably the prime example. I think he was clearly annoyed by woke politics and was pretty transphobic, but I don't recall him being that overtly Nazi; now he's basically retweeting straight-up Nazi stuff while shouting conspiracy theories about the persecution of South African whites…
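To unpack that jargon, here is a minimal simulated sketch of the ATE/CATE distinction. Every number, and the 10% "tech people" split, is invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical population: 10% "tech people", 90% everyone else.
is_tech = rng.random(n) < 0.10

# Made-up potential outcomes: attitude shift with vs. without Twitter.
# In this toy model Twitter barely moves the average person (+0.05)
# but moves tech people a lot (+2.0).
shift_untreated = rng.normal(0.0, 1.0, n)
shift_treated = shift_untreated + np.where(is_tech, 2.0, 0.05)

effect = shift_treated - shift_untreated
print(f"ATE  (everyone):    {effect.mean():+.2f}")           # ~ +0.25, small
print(f"CATE (tech people): {effect[is_tech].mean():+.2f}")  # +2.00, wild
print(f"CATE (non-tech):    {effect[~is_tech].mean():+.2f}") # +0.05, negligible
```

A near-zero average effect is perfectly compatible with a large effect inside one subgroup, which is the point being made about tech people.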
More thoughts follow. (Or maybe the same thoughts in a more considered package.)
My mental model is that the behavior we see on Twitter is mostly an artifact of selection and incentives rather than a driver of real-world political behavior. It's a classic case of range restriction: the platform draws a narrow slice of the population to begin with, and then further amplifies the loudest, most ideological, and most attention-seeking participants. Add the competitive dynamics of virality and the constant pressure for political signaling, and you get a stage for partisan peacocking that doubles as a status game.
That combination seems sufficient to explain most of the discrepancies between what appears dominant on Twitter and what shows up in surveys, elections, or everyday social life.
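Here is a minimal sketch of that selection-plus-amplification combination (the join probabilities and the posting-volume curve are purely illustrative assumptions, not estimates of anything):

```python
import numpy as np

rng = np.random.default_rng(1)
opinion = rng.normal(0.0, 1.0, 1_000_000)  # population ideology, centered at 0

# Selection: assume extreme, politically engaged people are more likely
# to join (join probability rises with |opinion|).
p_join = 0.05 + 0.4 * np.abs(opinion) / (1 + np.abs(opinion))
on_platform = rng.random(opinion.size) < p_join

# Amplification: assume posting volume also rises with extremity, so
# the visible feed oversamples the loudest users.
volume = np.exp(np.abs(opinion[on_platform]))
feed_extremity = np.average(np.abs(opinion[on_platform]), weights=volume)

print(f"Mean extremity, whole population: {np.abs(opinion).mean():.2f}")
print(f"Mean extremity, visible feed:     {feed_extremity:.2f}")  # noticeably higher
```

Nothing about the underlying population changes; the platform plus the feed simply shows you its noisiest tail.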
If that explanation is right, then “Twitter discourse” should mostly be noise: exaggerated, polarized, and unrepresentative. What would change my mind is credible evidence that the platform has meaningful causal effects on ordinary or marginal participants, not just the deeply committed political performers.
In particular, I’d want studies that look beyond the most active users and focus on people at the edges: outside tight ideological bubbles, casual users, or people newly exposed to the platform. Do their attitudes systematically shift after sustained exposure? Do they become more polarized, more extreme, or meaningfully different from comparable non-users? And can those shifts be distinguished from ordinary persuasion or the simple effect of encountering arguments or information they hadn’t previously seen?
Absent that kind of evidence, the simpler explanation still seems more plausible: X is mostly a distorted stage on which a small group performs, not a driver of political attitudes.
For good or ill, at least on Twitter you see the conflict. On Reels and Instagram the filter bubble is so well done that even if you're not looking for politics you get a kind of ambient current. Twitter always drives me to conflict: even when all I want is posts about Chappell Roan and Taylor Swift, it's a trench war.
My TikTok is mostly about teaching and vegan food and pop music and theater, so of course it's insanely left-wing. The idea that right-wing ideas do okay on Reels and TikTok is conceptually easy to grasp but hard to see in real time.
This makes a lot of sense, and it's no wonder the current admin acts like their feedback loop is completely broken.
I think a lot of people interpret this as "they don't care about popularity, and that's a sign they're authoritarian." While I agree the current admin is clearly wannabe-authoritarian, I think that in their heads they are doing what they believe is popular; their feedback loop is just completely broken.
And a "treatment effect" of the Twitter platform is very plausible; if anything, I feel like Elon himself (and maybe Marc Andreessen, etc.?) is the epitome of it.
When he took over X, I don't recall Elon being this outright Nazi or white supremacist, although he was very transphobic; but as he essentially turned X into 4chan's /pol/ or the Daily Stormer, his brain was poisoned in that direction too…
Twitter used to draw a broader spectrum of the population, but X is owned by an openly Nazi guy who made AI features that strip women's and children's clothes out of pictures and work antisemitism into conversations. Saying "turns out the people still using X skew right" is like saying "turns out people on Truth Social are mostly MAGA." Yeah, no kidding.
Very cool data set! One thing I always want to know about these finely sliced subgroups is what the error bars look like. I wish we had data on Twitter pre-Elon, so we could tease out causation from algorithmic meddling.
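For what it's worth, you can ballpark those error bars whenever a subgroup's sample size is reported. A quick sketch with a normal-approximation 95% interval; the subgroup size and proportion below are invented:

```python
import math

def prop_ci(p_hat: float, n: int, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation 95% confidence interval for a sample proportion."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# Hypothetical subgroup: 300 respondents, 62% approval.
lo, hi = prop_ci(0.62, 300)
print(f"95% CI: ({lo:.1%}, {hi:.1%})")  # roughly 56% to 68%
```

With slices that small, a ten-point gap between subgroups can sit mostly inside the overlapping intervals.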
I use X and have been pro-Trump since the Venezuela backlash, but I mostly enjoy arguing with the Chinese trolls who say Epstein's existence is why Russia is justified in invading Ukraine. I get my news "elsewhere."
Anne Applebaum simultaneously criticizing the operation for being too invasive and not invasive enough did more to shift my support towards Trump than anything a pro-bias could do.
Seeing Bernie argue against it purely on “legality” is the last straw. I will never vote again. I identify as Republican. What have I become?
Periodically you'll see that the percentage of adults who use X is very low. I'm seeing about 25% of US adults use it at all, and fewer than half of those say they use it daily. And I think we have all seen that fewer than 10% of users are responsible for more than 90% of posts.
Does it matter that more than 50% of a small slice of US adults get their information from X? It still seems to have a weird pull, both for the people who are on it and for people in media, meaning they love to talk about how terrible it is now.
This question actually asked people where they regularly get their news from, rather than where they post from; around 20% said X.