
The X Effect – How Elon Musk’s Platform is Shaping British Political Discourse and Society

by ChatGPT-5

In November 2025, Sky News published a landmark investigation titled The X Effect: How the World’s Richest Man is Boosting the British Right, accompanied by a methodological report, How Sky News Investigated X’s Algorithm for Political Bias.

Together, these reports present a meticulously researched and data-driven analysis of how Elon Musk’s social media platform, X (formerly Twitter), systematically amplifies right-wing and extreme political content in the United Kingdom—raising urgent questions about algorithmic governance, democratic influence, and the unchecked power of private digital infrastructure over public discourse.

1. The Investigation: Methods and Findings

Over nine months, Sky News’ Data and Forensics team created nine “clean” test accounts representing left-leaning, right-leaning, and neutral British users. The accounts had no prior engagement history, allowing researchers to observe X’s algorithmic behavior in isolation. During a two-week collection window in May 2025, the team gathered nearly 90,000 posts from 22,000 accounts. Using Google’s Gemini 2.5 Flash large language model, supervised by academic experts from Oxford, Maryland, and Stanford, Sky News classified each post’s political orientation and degree of extremity based on linguistic and thematic patterns.
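To make the classification step concrete, the sketch below shows roughly how an LLM-assisted labeling pipeline of this kind can be structured. It is not Sky News’ published code: the prompt wording, the label set, and the classify_post and summarize helpers are illustrative assumptions, and the model client is abstracted behind a generic llm_call callable rather than a specific API.

```python
import json

# Illustrative label set; the actual study's taxonomy may differ.
LABELS = ["left", "right", "neutral"]

PROMPT_TEMPLATE = (
    "Classify the political orientation of the following UK-focused post "
    "as one of {labels}, and say whether it uses extreme or hateful language. "
    "Reply as JSON with keys 'orientation' and 'extreme'.\n\nPost: {text}"
)

def classify_post(text, llm_call):
    """Label one post via an LLM; llm_call is any prompt -> response-text callable."""
    prompt = PROMPT_TEMPLATE.format(labels=LABELS, text=text)
    label = json.loads(llm_call(prompt))  # expected: {"orientation": "...", "extreme": true/false}
    if label["orientation"] not in LABELS:
        raise ValueError(f"unexpected label: {label['orientation']}")
    return label

def summarize(labels):
    """Share of political (non-neutral) posts coming from each side, in percent."""
    political = [l for l in labels if l["orientation"] != "neutral"]
    total = len(political) or 1
    return {side: round(100 * sum(l["orientation"] == side for l in political) / total, 1)
            for side in ("left", "right")}
```

In the actual study, labels produced in broadly this manner were aggregated across the roughly 90,000 collected posts under the supervision of the academic experts named above.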

The results were stark:

  • Over 60% of political content shown to new users came from right-wing accounts, while only 32% came from the left.

  • Neutral users were exposed to twice as much right-wing content as left-wing.

  • More than half of all political posts originated from accounts using extreme or hateful language, 72% of which were right-leaning.

  • Even users who followed only left-wing accounts were shown substantial amounts of right-wing content.

This bias was not a reflection of user choice or post frequency. Some politicians, like right-wing MP Rupert Lowe, were overrepresented fourfold, while left-wing figures such as George Galloway were underrepresented despite higher engagement and follower counts. The findings, corroborated by external analysts, suggest that X’s algorithm structurally privileges right-wing and polarizing content—an outcome likely influenced by leadership priorities and engagement-driven design.

2. Elon Musk’s Direct Role in Shaping Political Narratives

The investigation situates Musk not merely as a platform owner but as a political actor in UK discourse. Over the past year, Musk has publicly endorsed anti-immigration rhetoric, compared migrants to Tolkien’s monsters, and appeared virtually at rallies organized by Tommy Robinson. His interventions—amplified by X’s algorithm—have boosted fringe political movements such as Advance UK and Restore Britain, both linked to former Reform UK politicians.

Sky News demonstrated empirically that when Musk interacted with a politician’s post, engagement increased fivefold. These endorsements have translated into real-world consequences: membership spikes, media visibility, and shifting narratives in online political communities. Former Twitter executive Bruce Daisley summarized the dynamic: “Musk bought Twitter to have influence in American politics… the product can do the same here.”

3. Algorithmic Amplification and the Collapse of Neutrality

X’s algorithm—ostensibly open-sourced on GitHub—remains only partially transparent. Large portions are redacted or modified, obscuring how amplification decisions are made. The engagement-based design, prioritizing emotionally charged posts and videos, inherently rewards divisive and sensational content. Since Musk’s takeover, the dismissal of 80% of Twitter’s workforce, including its curation and trust-and-safety teams, has further degraded quality control. What once was a “town square” is now, as Sky News concludes, “private property where the loudest voices belong to those with power.”
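To illustrate why engagement-first ranking rewards divisive material even without any explicit political rule, consider the toy scoring function below. It is a deliberately simplified sketch, not X’s actual ranking code; the Post fields, the weights, and score_post are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Post:
    likes: int
    replies: int
    reposts: int

# Assumed weights for illustration only: replies and reposts, which are often
# driven by dispute or outrage, count for more than passive likes.
# These are not X's real parameters.
WEIGHTS = {"likes": 0.5, "replies": 13.5, "reposts": 1.0}

def score_post(p: Post) -> float:
    """Toy engagement score: whatever provokes the most interaction ranks highest."""
    return (WEIGHTS["likes"] * p.likes
            + WEIGHTS["replies"] * p.replies
            + WEIGHTS["reposts"] * p.reposts)

# A measured policy post versus an inflammatory one with heavy reply activity:
measured = Post(likes=900, replies=40, reposts=120)
inflammatory = Post(likes=400, replies=700, reposts=300)
assert score_post(inflammatory) > score_post(measured)  # the provocative post outranks the calm one
```

The structural point is that once interaction volume is the optimization target, posts engineered to provoke replies and reposts dominate the feed regardless of their accuracy or tone.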

The findings expose a deeper structural issue: algorithmic bias as a form of soft power. Platforms claim to be neutral intermediaries, yet their code operationalizes the values, politics, and financial incentives of their owners. Musk’s invocation of “free speech absolutism” thus functions less as a defense of open discourse and more as a cover for selective amplification that aligns with his personal worldview.

4. Political and Regulatory Responses

UK political leaders have begun to take notice. Labour’s Ed Miliband called Musk “a threat to British society and democracy,” while Liberal Democrat leader Ed Davey warned that the country is “letting in the world’s richest man to undermine our democracy.” Yet, as Ofcom clarified, the UK’s Online Safety Act does not mandate political balance and limits its scope to illegal material—meaning algorithmic bias remains beyond current regulatory reach.

This regulatory gap places democratic societies in a paradox: governments cannot easily intervene without risking censorship, while laissez-faire digital governance enables private platforms to distort political realities with impunity. The challenge is compounded by Musk’s global reach—he exerts political influence in countries where he holds no electoral accountability or jurisdictional oversight.

5. Implications for Users and Society

For ordinary users, the implications are profound:

  • Perceptual manipulation: Individuals experience a curated version of “public opinion” that may not reflect reality, reinforcing confirmation bias and political polarization.

  • Information asymmetry: Fringe movements can appear mainstream through algorithmic amplification, while moderate or diverse perspectives are buried.

  • Platform dependence: With X remaining a central hub for journalists, politicians, and activists, its biases can distort democratic dialogue at scale.

Societally, the situation underscores a broader transition: from public discourse mediated by institutions to narrative control mediated by algorithms. When engagement becomes the metric of truth, democracies risk evolving into attention economies governed by opaque technical systems owned by billionaires.

6. Conclusion: The Privatization of the Public Sphere

Sky News’ investigation is both a case study and a warning. It demonstrates empirically that X’s algorithm under Musk’s leadership amplifies right-wing and extremist voices, reshaping Britain’s digital public sphere in ways that mirror prior U.S. polarization patterns. More troublingly, it reveals that algorithmic design choices—once invisible technical parameters—now carry geopolitical weight.

For users, the lesson is vigilance: algorithmic feeds are not reflections of reality but constructed mirrors of power. For society, the question is urgent: how do democracies preserve free speech and pluralism when the infrastructure of speech itself is privately owned and politically active?

As the report concludes, finding a fix that satisfies both free expression and democratic integrity may be “mission impossible.” Yet without such a fix, the boundaries between public debate, private influence, and algorithmic manipulation will continue to erode—turning platforms like X from forums of conversation into instruments of control.