Abuse thrives where the government tweets – why it's time to quit X now
The use of the AI tool Grok to generate intimate images of real women and girls en masse is sickening.
Need another reason to leave X, formerly known as Twitter?
If the Great Replacement conspiracy theories and the constant barrage of personal abuse weren’t enough, this might tip you over the edge. Elon Musk’s AI tool Grok is being used – en masse – to generate intimate images of real women and girls, some under 18, digitally undressed and shared online. Victims include celebrities, teenagers, and everyday female users – and despite the uproar, and mealy-mouthed words from Grok and Musk, the practice appears to be continuing, even as I write this.
In a sentence that feels grimly predictable to write: this mass proliferation of sex abuse material has been framed as a technical failure. Loyal to the Silicon Valley “move fast and break things” ethos, Grok treated it as a lapse in safeguards, a moderation oversight, something that could be fixed with a technical patch. But this misses the point. This is not some malfunction; it is true to the platform’s purpose: X is a mechanism that strips away our humanity and amplifies society’s worst instincts.
Racism, misogyny, and sexualised cruelty were not invented by social media, but they are increasingly defined by it. Anonymity, algorithms, and consequence-free posting do more than allow harmful attitudes to surface; they reward their performance. Users are repeatedly shown they are not alone in their worst impulses. Transgression brings attention and community, while moral restraint carries little reward.
This is what happens when the guardrails of public discourse are removed. X has not expanded democratic debate; it has narrowed it, directing energy toward what provokes, shocks, and degrades most efficiently. That an AI tool on one of the world’s most influential platforms can generate sexualised images of women and children at scale is not incidental, but the predictable outcome of a culture that normalises humiliation as entertainment and blurs the line between speech and abuse.
More troubling is the government’s response – or lack thereof. In typical fashion, Ofcom said it had made “urgent contact” with X and xAI to understand what steps had been taken to comply with UK legal duties. Toothless, perhaps, but still stronger than Labour frontbencher Baroness Anderson, who claimed X was “still an appropriate platform” for the government to use, because she “believe[s] in freedom of speech and freedom of expression”.
Here’s the issue: Musk is not a neutral tech entrepreneur. He is a political actor who has repeatedly used X to target opponents and amplify conspiratorial narratives. His false claim that safeguarding minister Jess Phillips was an apologist for sexual violence was not abstract provocation; it had real-world consequences.
Then there’s the “free speech” claim, which feels particularly insulting in light of the government’s recent commitment to halve violence against women and girls. The mass creation of sexualised abuse material of women and girls is not incidental; it is an indication of what this site wants. X is structured to humiliate marginalised voices and bully them out. This is how white supremacy operates in practice. So it is less a platform for free speech than a stage for white male voices to dominate.
Yet government departments continue to use X as an official communications channel. This is defended as pragmatism – “the platform is where audiences are” – but in practice it legitimises and empowers an individual whose interests are thoroughly misaligned with ours. Musk does not want trust, accuracy, or restraint. Division is central to his modus operandi.
The government has a painful blind spot. As I’ve written before, the most consequential radicalisers in contemporary Britain are not adolescent boys reached through classroom interventions, but older men who already hold power – in politics, media, and technology. They shape the environments in which misogyny and supremacy fester. Ignoring these structures while focusing solely on early intervention – as per the VAWG strategy – is avoidance. The Grok episode illustrates this failure clearly: a platform that facilitates sexual degradation at scale is treated as an unfortunate but unavoidable feature of modern communication. Its radicalising effect is minimised, and responsibility displaced.
If the government is serious about free speech, tackling violence against women and girls, and countering extremism, it simply cannot continue to treat X as a neutral utility. It is not; it is far, far darker. It is an environment that shapes behaviour, rewards cruelty, and normalises abuse.
The question is no longer whether X can be made safer through incremental measures, but whether a liberal, democratic society can continue to rely on a platform whose incentives run so directly counter to its stated values – and whether those in power are willing to confront that reality. ■
About the author: Zoë Grünewald is Westminster Editor at The Lead and a freelance political journalist and broadcaster. Zoë previously worked as a policy and politics reporter at the New Statesman before joining the Independent as a political correspondent. When not writing about politics and policy, she is a regular commentator on TV and radio and a panellist on the Oh God What Now podcast.
👫 Agree with Zoë? Share this story with your friends, family and colleagues to help us reach more people with our independent journalism, always with a focus on people, policy and place.
And our January sale is now live too: you can get 26 per cent off an annual subscription to The Lead for full access and a way to support our independent progressive journalism. That makes it exactly £36.26 for the year – the equivalent of £3.02 per month, instead of £49. Bargain!